Blog Archives

America’s Abundant Forests

America’s Abundant Forests

Q: Are America’s forests in danger?

A: Not at all. Because the United States practices reforestation, its forests have actually grown in size over the past century. About one-third of the United States — 747 million acres — is covered with trees. In fact, we have more trees today than we had 70 years ago. And some 4 million more are planted each day. On the nation’s commercial forests, net annual growth exceeds harvests and losses to insects and disease by an impressive 47 percent.

Q: How much forestland is actually used for producing timber?

A: 504 million acres of America’s forestland is classified as “timberland,” productive forests capable of growing 20 cubic feet of commercial wood per acre per year. But not all of that is used for timber production.

A portion of that is permanently managed for uses such as recreation, streamside protection, and wildlife. About 52 million acres of U.S. forestland – an area larger than the states of North and South Carolina combined – are set aside by law for non-timber uses, such as parks or wilderness areas. Of the 191 million acres of forestland contained within the National Forest System owned and managed by the federal government, only 49 million acres are available for forest management.

Q: Who owns the nation’s commercial forests?

A: Of the nation’s 504 million acres of timberlands, 146 million acres, or 29 percent, are owned by federal, state and local governments. Fifty-eight percent of these productive woodlands – some 291 million acres – are held by some 10 million individual private landowners. About 67 million acres, or 13 percent of the total commercial timberlands, are owned by the forest products industry.

Managing A Renewable Resource

Q: Why do forest landowners sometimes take all of the trees out of an area?

A: That harvesting system is called “clearcutting” – removing all of the trees from a stand rather than picking and choosing. When compared to other techniques, clearcutting is often the best method for environmental as well as economic reasons. Some seedlings won’t grow in shade, so removing all of the mature trees ensures that enough light can reach new seedlings. Sometimes something calamitous, like a fire or windstorm, a tree disease, or an insect epidemic, requires that damaged trees be removed so that new trees can get a fresh start. And clearcutting requires fewer roads, which minimizes the expense as well as the disturbance to the environment. In each case, the type of harvest method used is dictated by the type of tree being harvested, the soil and terrain, wildlife habitat and the conditions needed to start the next forest.

Q: Do timber companies replant when they cut?

A: Yes. Forest products companies are in the business of growing and harvesting trees, so reforestation is important to them. In fact, more than 91 percent of all trees planted in America at the turn of the 21st century were planted by forest products companies and private timberland owners. And logging companies pay a special fee for replanting and reforestation when they buy the right to harvest timber on state or national forests.

In some regions of the country, nature itself replants very efficiently. Throughout the Northeast and Lake States for example, foresters often manage harvested areas to promote natural regrowth from sprouting and seeds.

Q: How many trees are planted each year?

A: In 1999, the forestry community planted some 1.7 billion trees in the United States. That’s an average of more than 4 million new trees planted every day – more than 5 new trees a year for every man, woman and child in America.

Caring For The Environment

Q: What is the forest products industry doing to protect environmental quality?

A: The nation’s forest products companies announced their continued commitment to the goal of sustainable forestry through the Sustainable Forestry Initiative® (SFI) Program. Developed by professional foresters, conservationists and scientists, the SFI℠ Program combines the perpetual growing and harvesting of trees with the long-term protection of wildlife, plants, soil and water quality. There are currently 107.8 million acres of forestland in North America enrolled in the SFI program, making it North America’s largest sustainable forestry program and among the largest in the world. As a testament to the forest products industry’s strong commitment to the goal of sustainable forestry, participation in the SFI program is a condition of membership in the American Forest & Paper Association (AF&PA) — the national trade association for the forest products and paper industry.

Q: Are there environmental advantages to using wood products?

A: Yes. Trees are a renewable resource. Most alternative materials come from nonrenewable resources, such as the petrochemicals used in making plastics and the ores used to make aluminum, iron and other metals.

Wood is also the most energy-efficient building material available today. When you compare the total energy costs of different kinds of building material – including the cost to acquire the raw material, transport it, process it into a useful product and then actually use it – wood far outshines its competitors. Steel wall studs require almost 9 times more energy to produce than do wood studs. A brick veneer wall requires 22 times more energy than wood siding, while aluminum siding requires 21 times more energy to produce than does equivalent wood siding. In addition, forest products are recyclable and biodegradable.

Forest Products Facts

Q: How many people are employed in forestry and in the forest products industry? 

A: Forestry is much more than just logging. About 1.7 million people are directly employed in the planting, growing, managing and harvesting of trees and production of wood and paper products in all 50 states. As many as ten people are involved in the harvesting and milling of one tree.

The forest industry ranks among the top ten manufacturing employers in 42 states, with an annual payroll of about $51 billion. That figure counts only those people directly engaged in the industry, not the many more who indirectly make their living from forest management and forest products.

Q: How many trees does the average American use each year? 

A: Each person uses wood and paper products each year equivalent to what can be produced from a single tree 18 inches in diameter and 100 feet tall. And each year, the nation plants more than 5 new trees for each American.

Q: How much wood goes into building our homes?

A: Over 90 percent of all homes in the United States are built with wood-framed walls and roofs. The average single-family American home (2,190 square feet) can contain 14,200 board feet of lumber and up to 14,000 square feet of panel products. That includes wood products ranging from structural beams and flooring to the sheathing, trim and paneling. Homebuilding, remodeling and home improvements are collectively the largest single use of lumber and wood products, accounting for about two-thirds of domestic wood-product consumption.

Posted in Blog, News & Announcements


St. Louis Cardinals – Interesting Questions, Facts, & Info.

Interesting Questions, Facts, and Information 

St. Louis Cardinals Baseball


  • Who was the first Cardinal to strike out over 250 batters in a season?

Bob Gibson. Gibson had a stellar year in 1964 by striking out 245, then broke that team record when he fanned 270 in 1965. He broke his own record again in 1970, giving the team 274 K’s in the season. Gibson won well-deserved Cy Young Awards in 1968 and 1970.

 

  • Who was the first Cardinal pitcher to win 30 games in a season?

 

Dizzy Dean. Dean was the only Cardinal to win 30 games in a season during the 20th century. His 30-7 record in 1934 earned him National League MVP honors. He also had 7 saves, along with 195 strikeouts. He retired with a win-loss record of 150-83 over 12 seasons.

 

  • In which ballpark did the Cardinals play in their first season?

 

Robison Field. Robison Field was the Cardinals’ first home in 1900. The franchise had started using Robison Field in 1893, when they were called the St. Louis Browns. The Cardinals drew a total of 270,000 fans in the 1900 season, 100,000 fewer than they drew as the Perfectos the season before. The Cardinals finished in 5th place.

 

  • What team was the first to skunk the Cardinals, 4 games to 0 in a World Series?

 

New York Yankees. In the Cardinals’ second World Series appearance, in 1928, they fell to the bats of the New York Yankees 4 games to 0. The Cardinals were beaten by no fewer than 3 runs in each game. Both Lou Gehrig and Babe Ruth hit 3 home runs in the Series, Ruth’s all coming in Game 4.

 

  • Who was the first Cardinal to obtain 200 hits in a season?

 

Jesse Burkett. Burkett was the first, in 1901, when he collected 226 hits for the Cardinals while batting .376. Rogers Hornsby broke Burkett’s team record with 235 in 1921, and broke it again with 250 in 1922. That 250 mark would stand until the end of the century.

 

  • Who was the first Cardinal to hit 50 home runs in a season?

 

Mark McGwire. McGwire hit 70 in 1998 and followed that up with 65 in 1999. Johnny Mize’s 43 in 1940 had been the team record, one that stood for almost 60 years.

 

  • In what year did the Cardinals win their first World Series?

 

1926. Under the guidance of player-manager Rogers Hornsby, who also played second base, the Cardinals won the 1926 World Series by beating the New York Yankees 4 games to 3. The team’s 20-game winner was Flint Rhem. They went to the World Series again in 1928 but lost, lost again in 1930, then won it all in 1931 against the Philadelphia Athletics.

 

  • Before becoming the Cardinals, the team was called what?

 

Perfectos. The original franchise name was the Brown Stockings in 1882. From 1883 to 1898 they were the Browns. For one season only, 1899, they were called the Perfectos, and the team became the Cardinals in 1900. The 1899 Perfectos were managed by Patsy Tebeau, who was also their first baseman. They finished the season with a record of 84-67, good for fifth place in the National League. Cy Young was on that team, going 26-16.

 

  • Did the Cardinals win the 1928 World Series?

 

No. The Cardinals have won the 1926, 1931, 1934, 1942, 1944, 1946, 1964, 1967, and 1982 World Series.

 

  • Who was the first Cardinals player to hit for the cycle in the 1900s?

 

Cliff Heathcote. Cliff Heathcote hit for the cycle in 19 innings on June 13, 1918.

 

  • Who was the first Cardinal player to hit 40 homers?

 

Rogers Hornsby. Hornsby was the first Cardinal to hit 40 home runs, doing so in 1922, the same year he won the Triple Crown.

 

  • On Opening Day 1996, what was the Cardinals attendance?

 

52,841. The Cardinals’ attendance that day was a record for Busch Stadium.

 

  • Was Ken Boyer ever a Cardinals manager?

 

Yes. From late 1978 to early in the 1980 season, Ken Boyer was the manager; he went 166-190.

 

  • Was Mort Cooper the first Cardinal to win the ERA crown?

 

No. Bill Doak won the Cardinals’ first two ERA crowns, the first of them with a 1.72 ERA in 1914.

 

  • Who won Game 7 of the 1931 World Series against the Philadelphia Athletics?

 

Burleigh Grimes. Burleigh Grimes beat George Earnshaw to win the 1931 World Series for the Cardinals.

 

  • Who won the first Cardinals’ season MVP award?

 

Rogers Hornsby. Hornsby won his first of two MVPs in 1925. Bob O’Farrell was the second Cardinal to win the award the following season in 1926.

 

  • Where did the Cardinals play from 1901 to 1918?

 

Robison Field. The St. Louis Cardinals played at Robison Field, where they averaged an attendance of 4,200 people per game.

 

  • Was Bob Gibson the first Cardinal pitcher to pitch a no-hitter?

 

No. That was Jesse Haines, who pitched his no-hitter on July 17, 1924.

 

  • How many Rookie Of The Year awards were awarded to a Cardinals player in the 1900s?

 

5. 1954 Wally Moon OF, 1955 Bill Virdon OF, 1974 Bake McBride OF, 1985 Vince Coleman OF, and 1986 Todd Worrell P.

 

  • How many perfect games did the Cardinals have in the 1900s?

 

0. The closest was by Bob Forsch on September 25, 1983. He did not give up any hits or walks, but there was an error committed by Ken Oberkfell.

 

  • Has there ever been a Cardinal who hit for the cycle but took longer than 9 innings to do it?

 

Yes. Cliff Heathcote 19 innings in 1918, Ken Boyer 11 innings in 1961, and Willie McGee 11 innings in 1984.

 

  • How many natural cycles (all done in the first four at-bats) did the Cardinals have in the first 100 years of the Major Leagues?

 

2. For those of you who don’t know what a “natural cycle” is, it’s when you hit for the cycle in your first four at-bats of the game. Ken Boyer did it in June 1964, and John Mabry did it in 1996.

 

  • What was the name of the St. Louis Cardinals franchise in 1899?

 

St. Louis Perfectos. They were the St. Louis Brown Stockings in 1882, the St. Louis Browns from 1883 to 1898, and became the St. Louis Cardinals in 1900.

 

  • How many World Series rings did the Cardinals have in a 40-year span between 1930 and 1970?

 

7. 1931: Philadelphia Athletics, 7 games. 1934: Detroit Tigers, 7 games. 1942: New York Yankees, 5 games. 1944: St. Louis Browns, 6 games. 1946: Boston Red Sox, 7 games. 1964: New York Yankees, 7 games. 1967: Boston Red Sox, 7 games.

  • How many times did the Cardinals go to the World Series in the 1980s?

 

3. 1982: Won the World Series against the Milwaukee Brewers. 1985: Lost in the World Series against the Kansas City Royals. 1987: Lost in the World Series against the Minnesota Twins.

 

  • How many Cy Young Awards did Bob Gibson win?

 

2. In 1968, he was 22-9 with a 1.12 ERA and 268 strikeouts. In 1970, he was 23-7 with a 3.12 ERA and 274 strikeouts.

  • How many Cardinals played in the 1966 All-Star game that was held at Busch Stadium?

 

2. Outfielder Curt Flood was making his second All-Star appearance, while it was catcher Tim McCarver’s first.

  • Which Cardinal second baseman was known as “Rajah”?

 

Rogers Hornsby. Hornsby was the first National League player to hit 300 home runs. He won the batting title seven times and the Triple Crown twice. He also managed the Cardinals as a player-manager in 1925 and 1926, leading the Cards to their first ever World Championship.

 

  • Which Cardinal played every single position on the diamond (literally) and was known as, “The Secret Weapon”?

 

Jose Oquendo. Oquendo literally played all nine positions in Major League games. Known as a slick fielder, he led the National League in fielding percentage in 1989 and 1990 while serving alongside Ozzie Smith as the Cardinals’ everyday second baseman. Oquendo became the first position player to record a decision as a pitcher when he gave up the game-winning runs against the Braves in a 19-inning loss.

 

  • Which Cardinal third baseman, outfielder, and first baseman earned the nickname, “El Hombre”?

 

Albert Pujols. A Latin play on “Stan the Man,” El Hombre literally means “the man” in Spanish. Also known as “Phat Albert,” Pujols was born in the Dominican Republic and moved to Kansas City as a teenager. Arriving speaking very little English, he allegedly walked into the office on his first day of high school in America and asked, “where baseball?” The Cardinals stole him in the 13th round of the draft, out of Maple Woods Community College. He got a break in 2001, when an injury to third baseman Bobby Bonilla allowed him to stay with the big league team; and the rest, as they say, is history.

 

  • Which Cardinal starting pitcher was known as “Dizzy”?

 

Jerome (Jay) Dean. Diz won 30 games in 1934 and was the last National League pitcher in the 20th century to accomplish that feat. His brother Paul, also known as Daffy, pitched for the Cardinals as well. Dizzy’s career was shortened by a broken toe he suffered in the 1937 All-Star game. He went on to become a successful broadcaster after his playing days.

 


 

Posted in Blog, News & Announcements

Variable Data Printing: The Facts & How It Can Benefit Your Business.

Variable Data Printing
Variable Data Printing allows you to adjust your direct mail marketing campaign so it’s more customized and personalized. Using VDP means that you can create more attention-grabbing direct mail pieces that appeal to your target market. There are many benefits to using variable data printing with your direct mail marketing campaign.

People love to see their own name on printed materials! They will be more inclined to absorb your messaging with variable data.

1. Increase Your ROI By Using Mailing List Data – Variable data printing allows you to create custom copy and information to grab people’s attention. Adjust the information on your direct mail campaign based on where people live, whether they are single or married, or any other piece of information you have on your audience. Having custom information means your direct mail piece will be more appealing and relevant to the recipient (see the simple data-merge sketch after this list).

2. Change Your Return Address or Phone Number – If you have multiple store locations or offices, print the one that will be the most convenient for each recipient. Or, depending on the campaign, you can change phone numbers so recipients can contact various people within the company.

3. Use Custom Fonts – Not only can addresses and mailing list data be printed directly on the direct mail piece, but all of it can be printed in a range of fonts so that the type complements the rest of your design or corporate identity. Depending on the variety of people you are targeting, you may want to consider changing your fonts.

4. Color Will Catch Attention – Color catches more attention than plain black-and-white printing. Use color wisely to complement your current brand identity and highlight important information on your direct mail piece. Let the Columbine Design team whip up something fresh and exciting for you!

5. Include Images and Maps – You can change up your images and maps using variable data printing. Show exactly where your storefront is relative to their neighborhood. Or show custom images based on that specific person’s demographic. Make sure you are sending the right message to the ideal recipient.
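
For readers who like to see the mechanics, here is a minimal sketch of the data-merge idea behind variable data printing, written in Python. The file name (mailing_list.csv) and the field names (first_name, nearest_store, store_address, store_phone) are hypothetical examples rather than part of any Columbine workflow, and real VDP jobs are normally driven by dedicated composition software rather than a script like this.

```python
# Minimal variable-data merge sketch (hypothetical file and field names).
# Each mailing-list record fills a template with a personalized greeting,
# the recipient's nearest store location, and a matching phone number.
import csv

TEMPLATE = (
    "Dear {first_name},\n"
    "Visit our {nearest_store} location at {store_address}.\n"
    "Questions? Call us at {store_phone}.\n"
)

def merge_pieces(list_path):
    """Yield one personalized block of copy per mailing-list record."""
    with open(list_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            yield TEMPLATE.format(**row)

if __name__ == "__main__":
    for piece in merge_pieces("mailing_list.csv"):
        print(piece)
        print("-" * 40)
```

The same idea extends to swapping images, maps, return addresses and fonts: every element of the piece becomes a field that the mailing list can fill in.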

Variable Data Printing IS THE NEXT STEP IN DIRECT MAIL MARKETING for your business. Don’t just send out generic campaigns to your audience. Create custom information that appeals to each recipient and makes them want to contact you.

Contact us:
Get with your Local Columbine Representative 
or…
Call (314) 423-2580 ext. 330 & ask for Nick!
nflemming@columbineprinting.com

 

Posted in Blog, News & Announcements

Print Works.

We are pleased to introduce our Print Works Package.  Ask us how Print Works can help benefit your business!  Shoot us an email or call (636) 346-7554 and ask for Nick.  Thanks!


Posted in Blog, News & Announcements

Power, Pollution, and the Internet

By James Glanz

SANTA CLARA, Calif. — Jeff Rothschild’s machines at Facebook had a problem he knew he had to solve immediately. They were about to melt.

The company had been packing a 40-by-60-foot rental space here with racks of computer servers that were needed to store and process information from members’ accounts. The electricity pouring into the computers was overheating Ethernet sockets and other crucial components.

Thinking fast, Mr. Rothschild, the company’s engineering chief, took some employees on an expedition to buy every fan they could find — “We cleaned out all of the Walgreens in the area,” he said — to blast cool air at the equipment and prevent the Web site from going down.

That was in early 2006, when Facebook had a quaint 10 million or so users and the one main server site. Today, the information generated by nearly one billion people requires outsize versions of these facilities, called data centers, with rows and rows of servers spread over hundreds of thousands of square feet, and all with industrial cooling systems.

They are a mere fraction of the tens of thousands of data centers that now exist to support the overall explosion of digital information. Stupendous amounts of data are set in motion each day as, with an innocuous click or tap, people download movies on iTunes, check credit card balances through Visa’s Web site, send Yahoo e-mail with files attached, buy products on Amazon, post on Twitter or read newspapers online.

A yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness.

Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found.

To guard against a power failure, they further rely on banks of generators that emit diesel exhaust. The pollution from data centers has increasingly been cited by the authorities for violating clean air regulations, documents show. In Silicon Valley, many data centers appear on the state government’s Toxic Air Contaminant Inventory, a roster of the area’s top stationary diesel polluters.

Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centers in the United States account for one-quarter to one-third of that load, the estimates show.

“It’s staggering for most people, even people in the industry, to understand the numbers, the sheer size of these systems,” said Peter Gross, who helped design hundreds of data centers. “A single data center can take more power than a medium-size town.”

A NATURAL PAIRING A data center in Ashburn, Va., seen past a Dominion Virginia Power substation serving it. Worldwide, such centers use the rough equivalent of the output of 30 nuclear power plants.

INSURANCE A row of backup generators, inside white housings, lines the back exterior of the Facebook data center in Prineville, Ore. They are to ensure service even in the event of a power failure.

LOW-TECH AMID HIGH-TECH A backup diesel generator at a large computer data center, one of six in the room. Combined, they could provide enough power for a community of 7,000 homes.

ENERGY HUNGRY Row after row after row of servers, at data centers around the world, perform the functions that constitute the cloud. They consume vast amounts of electricity, often wastefully.

Energy efficiency varies widely from company to company. But at the request of The Times, the consulting firm McKinsey & Company analyzed energy use by data centers and found that, on average, they were using only 6 percent to 12 percent of the electricity powering their servers to perform computations. The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their operations.

A server is a sort of bulked-up desktop computer, minus a screen and keyboard, that contains chips to process data. The study sampled about 20,000 servers in about 70 large data centers spanning the commercial gamut: drug companies, military contractors, banks, media companies and government agencies.

“This is an industry dirty secret, and no one wants to be the first to say mea culpa,” said a senior industry executive who asked not to be identified to protect his company’s reputation. “If we were a manufacturing industry, we’d be out of business straightaway.”

These physical realities of data are far from the mythology of the Internet: where lives are lived in the “virtual” world and all manner of memory is stored in “the cloud.”

The inefficient use of power is largely driven by a symbiotic relationship between users who demand an instantaneous response to the click of a mouse and companies that put their business at risk if they fail to meet that expectation.

Even running electricity at full throttle has not been enough to satisfy the industry. In addition to generators, most large data centers contain banks of huge, spinning flywheels or thousands of lead-acid batteries — many of them similar to automobile batteries — to power the computers in case of a grid failure as brief as a few hundredths of a second, an interruption that could crash the servers.

“It’s a waste,” said Dennis P. Symanski, a senior researcher at the Electric Power Research Institute, a nonprofit industry group. “It’s too many insurance policies.”

At least a dozen major data centers have been cited for violations of air quality regulations in Virginia and Illinois alone, according to state records. Amazon was cited with more than 24 violations over a three-year period in Northern Virginia, including running some of its generators without a basic environmental permit.

A few companies say they are using extensively re-engineered software and cooling systems to decrease wasted power. Among them are Facebook and Google, which also have redesigned their hardware. Still, according to recent disclosures, Google’s data centers consume nearly 300 million watts and Facebook’s about 60 million watts.

Many of these solutions are readily available, but in a risk-averse industry, most companies have been reluctant to make wholesale change, according to industry experts.

Improving or even assessing the field is complicated by the secretive nature of an industry that is largely built around accessing other people’s personal data.

For security reasons, companies typically do not even reveal the locations of their data centers, which are housed in anonymous buildings and vigilantly protected. Companies also guard their technology for competitive reasons, said Michael Manos, a longtime industry executive. “All of those things play into each other to foster this closed, members-only kind of group,” he said.

That secrecy often extends to energy use. To further complicate any assessment, no single government agency has the authority to track the industry. In fact, the federal government was unable to determine how much energy its own data centers consume, according to officials involved in a survey completed last year.

The survey did discover that the number of federal data centers grew from 432 in 1998 to 2,094 in 2010.

To investigate the industry, The Times obtained thousands of pages of local, state and federal records, some through freedom of information laws, that are kept on industrial facilities that use large amounts of energy. Copies of permits for generators and information about their emissions were obtained from environmental agencies, which helped pinpoint some data center locations and details of their operations.

In addition to reviewing records from electrical utilities, The Times also visited data centers across the country and conducted hundreds of interviews with current and former employees and contractors.

Some analysts warn that as the amount of data and energy use continue to rise, companies that do not alter their practices could eventually face a shake-up in an industry that has been prone to major upheavals, including the bursting of the first Internet bubble in the late 1990s.

“It’s just not sustainable,” said Mark Bramfitt, a former utility executive who now consults for the power and information technology industries. “They’re going to hit a brick wall.”

Bytes by the Billions

Wearing an FC Barcelona T-shirt and plaid Bermuda shorts, Andre Tran strode through a Yahoo data center in Santa Clara where he was the site operations manager. Mr. Tran’s domain — there were servers assigned to fantasy sports and photo sharing, among other things — was a fair sample of the countless computer rooms where the planet’s sloshing tides of data pass through or come to rest.

Aisle after aisle of servers, with amber, blue and green lights flashing silently, sat on a white floor punctured with small round holes that spit out cold air. Within each server were the spinning hard drives that store the data. The only hint that the center was run by Yahoo, whose name was nowhere in sight, could be found in a tangle of cables colored in the company’s signature purple and yellow.

“There could be thousands of people’s e-mails on these,” Mr. Tran said, pointing to one storage aisle. “People keep old e-mails and attachments forever, so you need a lot of space.”

This is the mundane face of digital information — player statistics flowing into servers that calculate fantasy points and league rankings, snapshots from nearly forgotten vacations kept forever in storage devices. It is only when the repetitions of those and similar transactions are added up that they start to become impressive.

Each year, chips in servers get faster, and storage media get denser and cheaper, but the furious rate of data production goes a notch higher.

Jeremy Burton, an expert in data storage, said that when he worked at a computer technology company 10 years ago, the most data-intensive customer he dealt with had about 50,000 gigabytes in its entire database. (Data storage is measured in bytes. The letter N, for example, takes 1 byte to store. A gigabyte is a billion bytes of information.)

Today, roughly a million gigabytes are processed and stored in a data center during the creation of a single 3-D animated movie, said Mr. Burton, now at EMC, a company focused on the management and storage of data.

Just one of the company’s clients, the New York Stock Exchange, produces up to 2,000 gigabytes of data per day that must be stored for years, he added.

EMC and the International Data Corporation together estimated that more than 1.8 trillion gigabytes of digital information were created globally last year.

“It is absolutely a race between our ability to create data and our ability to store and manage data,” Mr. Burton said.

About three-quarters of that data, EMC estimated, was created by ordinary consumers.

With no sense that data is physical or that storing it uses up space and energy, those consumers have developed the habit of sending huge data files back and forth, like videos and mass e-mails with photo attachments. Even seemingly mundane actions like running an app to find an Italian restaurant in Manhattan or a taxi in Dallas require servers to be turned on and ready to process the information instantaneously.

The complexity of a basic transaction is a mystery to most users: Sending a message with photographs to a neighbor could involve a trip through hundreds or thousands of miles of Internet conduits and multiple data centers before the e-mail arrives across the street.

“If you tell somebody they can’t access YouTube or download from Netflix, they’ll tell you it’s a God-given right,” said Bruce Taylor, vice president of the Uptime Institute, a professional organization for companies that use data centers.

To support all that digital activity, there are now more than three million data centers of widely varying sizes worldwide, according to figures from the International Data Corporation.

Nationwide, data centers used about 76 billion kilowatt-hours in 2010, or roughly 2 percent of all electricity used in the country that year, based on an analysis by Jonathan G. Koomey, a research fellow at Stanford University who has been studying data center energy use for more than a decade. DatacenterDynamics, a London-based firm, derived similar figures.

The industry has long argued that computerizing business transactions and everyday tasks like banking and reading library books has the net effect of saving energy and resources. But the paper industry, which some predicted would be replaced by the computer age, consumed 67 billion kilowatt-hours from the grid in 2010, according to Census Bureau figures reviewed by the Electric Power Research Institute for The Times.

Direct comparisons between the industries are difficult: paper uses additional energy by burning pulp waste and transporting products. Data centers likewise involve tens of millions of laptops, personal computers and mobile devices.

Chris Crosby, chief executive of the Dallas-based Compass Datacenters, said there was no immediate end in sight to the proliferation of digital infrastructure.

“There are new technologies and improvements,” Mr. Crosby said, “but it still all runs on a power cord.”

‘Comatose’ Power Drains

Engineers at Viridity Software, a start-up that helped companies manage energy resources, were not surprised by what they discovered on the floor of a sprawling data center near Atlanta.

Viridity had been brought on board to conduct basic diagnostic testing. The engineers found that the facility, like dozens of others they had surveyed, was using the majority of its power on servers that were doing little except burning electricity, said Michael Rowan, who was Viridity’s chief technology officer.

A senior official at the data center already suspected that something was amiss. He had previously conducted his own informal survey, putting red stickers on servers he believed to be “comatose” — the term engineers use for servers that are plugged in and using energy even as their processors are doing little if any computational work.

“At the end of that process, what we found was our data center had a case of the measles,” said the official, Martin Stephens, during a Web seminar with Mr. Rowan. “There were so many red tags out there it was unbelievable.”

The Viridity tests backed up Mr. Stephens’s suspicions: in one sample of 333 servers monitored in 2010, more than half were found to be comatose. All told, nearly three-quarters of the servers in the sample were using less than 10 percent of their computational brainpower, on average, to process data.

The data center’s operator was not some seat-of-the-pants app developer or online gambling company, but LexisNexis, the database giant. And it was hardly unique.

In many facilities, servers are loaded with applications and left to run indefinitely, even after nearly all users have vanished or new versions of the same programs are running elsewhere.

“You do have to take into account that the explosion of data is what aids and abets this,” said Mr. Taylor of the Uptime Institute. “At a certain point, no one is responsible anymore, because no one, absolutely no one, wants to go in that room and unplug a server.”

Kenneth Brill, an engineer who in 1993 founded the Uptime Institute, said low utilization began with the field’s “original sin.”

In the early 1990s, Mr. Brill explained, software operating systems that would now be considered primitive crashed if they were asked to do too many things, or even if they were turned on and off. In response, computer technicians seldom ran more than one application on each server and kept the machines on around the clock, no matter how sporadically that application might be called upon.

So as government energy watchdogs urged consumers to turn off computers when they were not being used, the prime directive at data centers became running computers at all cost.

A crash or a slowdown could end a career, said Michael Tresh, formerly a senior official at Viridity. A field born of cleverness and audacity is now ruled by something else: fear of failure.

“Data center operators live in fear of losing their jobs on a daily basis,” Mr. Tresh said, “and that’s because the business won’t back them up if there’s a failure.”

In technical terms, the fraction of a computer’s brainpower being used on computations is called “utilization.”

McKinsey & Company, the consulting firm that analyzed utilization figures for The Times, has been monitoring the issue since at least 2008, when it published a report that received little notice outside the field. The figures have remained stubbornly low: the current findings of 6 percent to 12 percent are only slightly better than those in 2008. Because of confidentiality agreements, McKinsey is unable to name the companies that were sampled.

David Cappuccio, a managing vice president and chief of research at Gartner, a technology research firm, said his own recent survey of a large sample of data centers found that typical utilizations ran from 7 percent to 12 percent.

“That’s how we’ve overprovisioned and run data centers for years,” Mr. Cappuccio said. “ ‘Let’s overbuild just in case we need it’ — that level of comfort costs a lot of money. It costs a lot of energy.”

Servers are not the only components in data centers that consume energy. Industrial cooling systems, circuitry to keep backup batteries charged and simple dissipation in the extensive wiring all consume their share.

In a typical data center, those losses combined with low utilization can mean that the energy wasted is as much as 30 times the amount of electricity used to carry out the basic purpose of the data center.
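
As a rough illustration of how a multiple like that arises, the short Python sketch below combines the 6 to 12 percent utilization range reported above with an assumed facility overhead factor of 1.9 (the extra draw from cooling, power distribution and idling). The 1.9 multiplier is an assumption chosen for illustration, not a figure from the reporting.

```python
# Back-of-the-envelope sketch: wasted energy as a multiple of useful energy.
# utilization: fraction of server electricity spent on computation
#              (roughly 0.06 to 0.12 on average, per the figures above).
# facility_overhead: total facility draw per unit of server draw
#                    (assumed to be 1.9 here; real values vary widely).
def waste_ratio(utilization, facility_overhead):
    useful = utilization
    total = facility_overhead
    return (total - useful) / useful

if __name__ == "__main__":
    for u in (0.06, 0.12):
        print(f"utilization {u:.0%}: waste is about {waste_ratio(u, 1.9):.0f}x the useful energy")
    # At 6 percent utilization the ratio comes out near 30, which is how a
    # low-utilization facility can waste roughly 30 times the electricity
    # that actually performs computation.
```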

Some companies, academic organizations and research groups have shown that vastly more efficient practices are possible, although it is difficult to compare different types of tasks.

The National Energy Research Scientific Computing Center, which consists of clusters of servers and mainframe computers at the Lawrence Berkeley National Laboratory in California, ran at 96.4 percent utilization in July, said Jeff Broughton, the director of operations. The efficiency is achieved by queuing up large jobs and scheduling them so that the machines are running nearly full-out, 24 hours a day.
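
The contrast between the two operating styles described in this article, dedicating an always-on server to each application versus queuing jobs onto a shared pool, can be made concrete with a toy calculation. The workload numbers below (20 applications, each needing two hours of compute per day) are invented purely for illustration.

```python
# Toy comparison of server provisioning styles. Workload numbers are made up.
def utilization(busy_hours, servers, wall_hours):
    """Fraction of available server-hours actually spent computing."""
    return busy_hours / (servers * wall_hours)

jobs, hours_per_job, day = 20, 2.0, 24.0
busy = jobs * hours_per_job          # 40 server-hours of real work per day

# Style 1: one always-on server per application (the field's "original sin").
dedicated = utilization(busy, servers=jobs, wall_hours=day)

# Style 2: queue the same jobs onto two shared servers running nearly full-out.
pooled = utilization(busy, servers=2, wall_hours=day)

print(f"dedicated servers: {dedicated:.0%} utilization")   # about 8%
print(f"queued shared pool: {pooled:.0%} utilization")     # about 83%
```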

A company called Power Assure, based in Santa Clara, markets a technology that enables commercial data centers to safely power down servers when they are not needed — overnight, for example.

But even with aggressive programs to entice its major customers to save energy, Silicon Valley Power has not been able to persuade a single data center to use the technique in Santa Clara, said Mary Medeiros McEnroe, manager of energy efficiency programs at the utility.

“It’s a nervousness in the I.T. community that something isn’t going to be available when they need it,” Ms. McEnroe said.

The streamlining of the data center done by Mr. Stephens for LexisNexis Risk Solutions is an illustration of the savings that are possible.

In the first stage of the project, he said that by consolidating the work in fewer servers and updating hardware, he was able to shrink a 25,000-square-foot facility into 10,000 square feet.

Of course, data centers must have some backup capacity available at all times and achieving 100 percent utilization is not possible. They must be prepared to handle surges in traffic.

Mr. Symanski, of the Electric Power Research Institute, said that such low efficiencies made sense only in the obscure logic of the digital infrastructure.

“You look at it and say, ‘How in the world can you run a business like that,’ ” Mr. Symanski said. The answer is often the same, he said: “They don’t get a bonus for saving on the electric bill. They get a bonus for having the data center available 99.999 percent of the time.”

The Best-Laid Plans

In Manassas, Va., the retailing colossus Amazon runs servers for its cloud amid a truck depot, a defunct grain elevator, a lumberyard and junk-strewn lots where machines compress loads of trash for recycling.

The servers are contained in two Amazon data centers run out of three buildings shaped like bulky warehouses with green, corrugated sides. Air ducts big enough to accommodate industrial cooling systems sprout along the rooftops; huge diesel generators sit in rows around the outside.

The term “cloud” is often used loosely to describe a data center’s functions. More specifically, it refers to a service for leasing computing capacity. These facilities are primarily powered from the national grid, but generators and batteries are nearly always present to provide electricity if the grid goes dark.

The Manassas sites are among at least eight major data centers Amazon operates in Northern Virginia, according to records of Virginia’s Department of Environmental Quality.

The department is on familiar terms with Amazon. As a result of four inspections beginning in October 2010, the company was told it would be fined $554,476 by the agency for installing and repeatedly running diesel generators without obtaining standard environmental permits required to operate in Virginia.

Even if there are no blackouts, backup generators still emit exhaust because they must be regularly tested.

After months of negotiations, the penalty was reduced to $261,638. In a “degree of culpability” judgment, all 24 violations were given the ranking “high.”

Drew Herdener, an Amazon spokesman, agreed that the company “did not get the proper permits” before the generators were turned on. “All of these generators were all subsequently permitted and approved,” Mr. Herdener said.

The violations came in addition to a series of lesser infractions at one of Amazon’s data centers in Ashburn, Va., in 2009, for which the company paid $3,496, according to the department’s records.

Of all the things the Internet was expected to become, it is safe to say that a seed for the proliferation of backup diesel generators was not one of them.

Terry Darton, a former manager at Virginia’s environmental agency, said permits had been issued to enough generators for data centers in his 14-county corner of Virginia to nearly match the output of a nuclear power plant.

“It’s shocking how much potential power is available,” said Mr. Darton, who retired in August.

No national figures on environmental violations by data centers are available, but a check of several environmental districts suggests that the centers are beginning to catch the attention of regulators across the country.

Over the past five years in the Chicago area, for example, the Internet powerhouses Savvis and Equinix received violation notices, according to records from the Illinois Environmental Protection Agency. Aside from Amazon, Northern Virginia officials have also cited data centers run by Qwest, Savvis, VeriSign and NTT America.

Despite all the precautions — the enormous flow of electricity, the banks of batteries and the array of diesel generators — data centers still crash.

Amazon, in particular, has had a series of failures in Northern Virginia over the last several years. One, in May 2010 at a facility in Chantilly, took businesses dependent on Amazon’s cloud offline for what the company said was more than an hour — an eternity in the data business.

Pinpointing the cause became its own information glitch.

Amazon announced that the failure “was triggered when a vehicle crashed into a high-voltage utility pole on a road near one of our data centers.”

As it turns out, the car accident was mythical, a misunderstanding passed from a local utility lineman to a data center worker to Amazon headquarters. Instead, Amazon said that its backup gear mistakenly shut down part of the data center after what Dominion Virginia Power said was a short on an electrical pole that set off two momentary failures.

Mr. Herdener of Amazon said the backup system had been redesigned, and that “we don’t expect this condition to repeat.”

The Source of the Problem

Last year in the Northeast, a $1 billion feeder line for the national power grid went into operation, snaking roughly 215 miles from southwestern Pennsylvania, through the Allegheny Mountains in West Virginia and terminating in Loudoun County, Va.

The work was financed by millions of ordinary ratepayers. Steven R. Herling, a senior official at PJM Interconnection, a regional authority for the grid, said the need to feed the mushrooming data centers in Northern Virginia was the “tipping point” for the project in an otherwise down economy.

Data centers in the area now consume almost 500 million watts of electricity, said Jim Norvelle, a spokesman for Dominion Virginia Power, the major utility there. Dominion estimates that the load could rise to more than a billion watts over the next five years.

Data centers are among utilities’ most prized customers. Many utilities around the country recruit the facilities for their almost unvarying round-the-clock loads. Large, steady consumption is profitable for utilities because it allows them to plan their own power purchases in advance and market their services at night, when demand by other customers plummets.

Mr. Bramfitt, the former utility executive, said he feared that this dynamic was encouraging a wasteful industry to cling to its pedal-to-the-metal habits. Even with all the energy and hardware pouring into the field, others believe it will be a challenge for current methods of storing and processing data to keep up with the digital tsunami.

Some industry experts believe a solution lies in the cloud: centralizing computing among large and well-operated data centers. Those data centers would rely heavily on a technology called virtualization, which in effect allows servers to merge their identities into large, flexible computing resources that can be doled out as needed to users, wherever they are.

One advocate of that approach is Mr. Koomey, the Stanford data center expert. But he said that many companies that try to manage their own data centers, either in-house or in rental spaces, are still unfamiliar with or distrustful of the new cloud technology. Unfortunately, those companies account for the great majority of energy usage by data centers, Mr. Koomey said.

Others express deep skepticism of the cloud, saying that the sometimes mystical-sounding belief in its possibilities is belied by the physicality of the infrastructure needed to support it.

Using the cloud “just changes where the applications are running,” said Hank Seader, managing principal for research and education at the Uptime Institute. “It all goes to a data center somewhere.”

Some wonder if the very language of the Internet is a barrier to understanding how physical it is, and is likely to stay. Take, for example, the issue of storing data, said Randall H. Victora, a professor of electrical engineering at the University of Minnesota who does research on magnetic storage devices.

“When somebody says, ‘I’m going to store something in the cloud, we don’t need disk drives anymore’ — the cloud is disk drives,” Mr. Victora said. “We get them one way or another. We just don’t know it.”

Whatever happens within the companies, it is clear that among consumers, what are now settled expectations largely drive the need for such a formidable infrastructure.

“That’s what’s driving that massive growth — the end-user expectation of anything, anytime, anywhere,” said David Cappuccio, a managing vice president and chief of research at Gartner, the technology research firm. “We’re what’s causing the problem.”

 

A version of this article appeared in print on September 23, 2012, on page A1 of the New York edition with the headline: Power, Pollution and the Internet.
Posted in Blog, Uncategorized

February Fun Facts Quiz

1. Which of the following are celebrated every year during the month of February?

A. Groundhog Day
B. Valentine’s Day
C. National Bird Feeding month
D. Black History month
E. All of the above

2. In honor of Presidents Day – Which of the following Presidents call February their birth month? 

A. George Washington
B. Abraham Lincoln
C. William Henry Harrison
D. Ronald Reagan
E. All of the above 

3. In the spirit of Groundhog Day (Feb. 2nd, 2013), what is the name of our very own groundhog here in St. Louis?
A. Punxsutawney Phil
B. Pippi
C. Woody
D. Grady the Groundhog

 4. All of these events occurred in the month of February except:

A. Feb. 24th, 1868 – The first parade to have floats was staged at Mardi Gras in New Orleans, Louisiana.
B. Feb.  1st, 1865 – President Abraham Lincoln signed the 13th Amendment to the U.S. Constitution (beyond the Bill of Rights).
C. Feb. 18th, 1885 – The Adventures of Huckleberry Finn by Mark Twain was first published.
D. Feb. 6th, 1607 – Captain John Smith landed in Jamestown.















SPOILER ALERT!!! CORRECT ANSWERS BELOW:

 

  1. E. All of the above – Yes, February is National Bird Feeding month!  This celebratory month was created to educate the public on the wild bird feeding and watching hobby. On February 23rd, 1994, John Porter (R-IL) proclaimed February as National Bird Feeding month when he read a resolution into the Congressional Record.
  2. E. All of the above – these 4 gentlemen are the only Presidents born in the month of February…George Washington – born February 22nd, 1732.  Abraham Lincoln – born February 12th, 1809.  William Henry Harrison – born February 9th, 1773.  Ronald Reagan – born February 6th, 1911.
  3. B. Pippi. Two-year-old Pippi resides right here in town at the St. Louis Zoo! Punxsutawney Phil hails from Punxsutawney, Pennsylvania. Grady the Groundhog – Chimney Rock, North Carolina. Woody – Howell, Michigan. The largest Groundhog Day celebration is held in Punxsutawney, Pennsylvania, where crowds as large as 40,000 have gathered to celebrate the holiday since at least 1886!
  4. D. Feb. 6th, 1607 – Captain John Smith lands in Jamestown. In fact, the Captain landed at Jamestown on May 2nd, 1607, and forever changed the dynamic of the “New World”.

 

Posted in Blog, Uncategorized

What’s trending in the world of Paper & Print?

Ed Kniep III – Chairman of the Board – SKH Paper

• At its peak, in 1994, there were 54,000 printers in the US; today that number is 30,000 (PIA).
• Between ’08 and ’12 in the US, 2.5 BILLION lbs. of Uncoated Free Sheet paper were taken out of production.
• The US paper and forest industries today employ nearly 1 million people.

Ed’s comments on these 3 trending topics in the industry:

According to the Printing Industries of America, at its peak there were 54,000 printers in the US; today the number is approximately 30,000, and it is projected that 23,500 will survive by 2020. These survivors will adapt to new digital printing technology and to full-service marketing incorporating data management, printing on different substrates, connecting digital media, mailing services, design and customer-specific requests. Printing is alive and well, and the survivors will grow and prosper.

 

Between 2008 and 2012 in the US, 60,857 truckloads of Uncoated Free Sheet paper were taken out of production. These included offset, forms bond, envelope, text, cover, writing and tablet grades. Surely the Great Recession was responsible for many of the machine closures, but this trend was under way before 2008, with the internet replacing the need to print everything from newspapers to bank statements to newsletters to annual reports. The small paper manufacturers are gone, but the large producers like Domtar, International Paper, Boise and Georgia Pacific continue to make product and will prosper in the future.

 

The organization called Two Sides has stated that the US paper and forest industries employ 900,000 people. Of course these people are not just involved in the manufacture, distribution and export of printing papers. They are also involved with tissue, towel, packaging, containerboard, boxboard, specialty papers and many timber products. It is estimated that these 900,000 earn $50 billion annually. As our economic recovery moves forward, the number employed will grow and remain an important part of our economy. 

 

Thanks,

Ed

Posted in Blog, Uncategorized

Using Paper = More Trees


“We are in danger of losing the magnificent working forests in America as properties continue to change hands according to economic and environmental pressures.  It is time we focused on keeping working forests as forests, because of the role they play in preserving the green infrastructure of the nation.”
~ Lawrence Selzer – President and CEO, The Conservation Fund


Land provides social, recreational and financial benefits. It can generate revenue for owners in many ways – one of the most environmentally friendly is growing trees. Because 70 percent of U.S. timberland is privately owned, it is imperative that both the social and economic incentives to continue growing trees outweigh the financial rewards of sacrificing forests to development. Print (paper more specifically!) grows trees.
Remember…
Recycle. Reuse. Rejuvenate.

Posted in Blog

Hit us up!

Check out our November Newsletter. Subscribe at the top of the page to receive a monthly communication from us here @ Columbine. You can expect industry related news, trivia games, geographically relevant info, promos, and special offers. Thanks!


Posted in Blog
Columbine Printing Co.
10415 Trenton Ave.
St. Louis, MO 63132

314-423-2580
Fax: 314-423-0698
E-Mail Us
Sign up for Columbine Printing Co. Newsletter
* = required field


Request a Quote


Or Give Us a Call at: 314-423-2580