

The personal journal of technology journalist and conference speaker Randall S. Newton.


Thursday, October 27, 2005

Now The Cows are Milking Themselves

As a youth, I took my turns milking the family cow. Once my hands were strong enough to do it without resting, I found it a peaceful endeavor. The milk would flow, the cow would munch on grain, the barn cats would circle about. It was quiet and relaxing.

Milking machines have been around for generations. They are all variations on a vacuum cleaner, re-engineered for the cow udder. (I suppose there are varieties for milking goats and sheep, but I've only been around cow dairying.) Now, The Times of London is reporting that on some UK and European farms, robotic milking machines allow the cows to milk themselves.

“The cows set their own agenda,” said Neil Rowe, manager of Manor Farm at Marcham in Oxfordshire which has switched to the system. “It’s about autonomy, it’s about enrichment, it’s about stepping back and allowing the cows and the system to develop a relationship.”
As implemented on Rowe's farm, the system even automatically mucks out the milking parlor as needed. I may wax nostalgic about milking old Bossie, but you'll never get me to admit there was any pleasure to be found in shoveling out the manure.

Wednesday, October 26, 2005

Google Exec: Content and URLs Now Rule

Power in computing has shifted from proprietary Microsoft APIs to URLs on the web, says Google vice president Adam Bosworth. His comments appear in a new InfoWorld article by Paul Krill.

Applications used to be built via a "control model," with that weapon of control being leveraged by Microsoft via an API, Bosworth said. "This model was kind of a beautiful thing if you were Microsoft," said Bosworth, who formerly worked at Microsoft and BEA Systems.

"But the fact is, today I think this model is totally irrelevant," he said.

Full article: Google exec touts communities, content over APIs | InfoWorld | News | 2005-10-21 | By Paul Krill

Tuesday, October 25, 2005

Media Economics and the Web, Part 1

There’s a buzz reaching critical mass about a concept called “Web 2.0.” In case you aren’t familiar with the term, here is the Wikipedia definition for “Web 2.0”:
Web 2.0 is a term often applied to a perceived ongoing transition of the World Wide Web from a collection of websites to a full-fledged computing platform serving web applications to end users. Ultimately Web 2.0 services are expected to replace desktop computing applications for many purposes.
It is a heady goal these Web 2.0 evangelists have: transforming the Web we know today into something more robust and useful. If you go back and read the early thinking about the World Wide Web, some of the ideas now being called Web 2.0 were there from the beginning.

Various new programming languages and tools (Ruby on Rails, the Google Application Programming Interfaces (APIs), Ajax) are considered to be Web 2.0 tools. Microsoft, as usual, is attempting to create a Trojan horse in the Ajax/Web 2.0 space with a programming tool set called Atlas. There is a hot new annual conference called Web 2.0, devoted to discovering the next big ideas and companies that will, supposedly, reshape the Web. The company/product I wrote about in a recent blog entry (Meebo, a Web-based instant messaging client) is a typical Web 2.0 application.

I’ve been thinking about writing a Web 2.0 overview for this blog for quite a while. What flipped me into blogging mode was a PowerPoint presentation I read today entitled “The New Economics of Media: Micromedia, Connected Consumption and the Snowball Effect” by Umair Haque, a former Oxford Ph.D. student who is now a consultant on innovation. To quote Haque:

In a Media 2.0 world, mass media industry economics are inverted, value chains begin to atomize, and traditional dominant strategies, like blockbusters, begin to fail. What resources and capabilities do incumbents and new entrants need to succeed?

I started to read the PowerPoint presentation, and two hours later I had 11 pages of hand-written notes. What follows is my distillation and interpretation, which will take two or three separate items to complete.

Haque refers to mainstream media as “Media 1.0” and the new “micromedia” as “Media 2.0.” Let me explain each briefly.

In the world of Media 1.0, the cost to create media content (a newspaper, a magazine, a TV program, etc.) was high, while the cost to acquire customers was low, and dropped as the quantity of customers increased. It took a lot of money to buy a press (or a license to broadcast), hire a staff, create the media product, and market the product. But once the money was invested, the returns were pretty good. Money spent on marketing and retailing was the best investment Media 1.0 companies could make, because it increased the number of consumers. Consumers were an abundant resource, which means they were cheap to acquire.

Media 1.0 products were created to satisfy the greatest number of consumers, who had few choices. That’s why it is called mass media. As an economist would see it, in a Media 1.0 world potential consumers are just standing around waiting for media to consume. Think of Media 1.0 as standing in line at a soup kitchen.

Media 1.0 may have TV and radio networks, but there is no network effect of the Internet at play. Media 1.0 is a one-to-many publishing process. Media 2.0, by contrast, is all about the Internet. Media 2.0 offers one-to-many, many-to-many and (even more importantly) many-to-one publishing. Consumers are becoming linked in a two-way media process (some replace “consumers” at this point with “prosumers”).

In the Media 2.0 era, the cost to create media content is low, low, low. Blogger.com is free; other blog sites are either free or cheap. Even the top-flight content management systems, as used by robust online news sites, cost a pittance compared to a printing press or a TV studio. The result is an explosion of content providers. What becomes expensive in Media 2.0 is the cost—in relative terms—of acquiring customers. Consumer options are abundant, but the amount of time they can devote to any one website or other media source is small. The same consumers now have a near-infinite number of content providers to choose from. If Media 1.0 was a soup kitchen, Media 2.0 is a very long salad bar.

For both forms of media, attention is the name of the game. Old media and new media both need the attention of consumers or they are dead. In Media 1.0, because attention was abundant (which to an economist means it was cheap), it made sense to invest in infrastructure and marketing to harvest the readily available attention. Such investment was in-kind with the media method: newspapers would give away copies, radio stations would run contests, TV stations would produce extravaganzas with big-name stars, and studios would produce blockbusters.

In Media 2.0, attention is scarce (which to an economist means it is expensive). Investing in infrastructure becomes a bad investment. You can buy a bigger press and print more copies, but the people on the street are too distracted by the thousands of other content providers to notice you. The right kind of investment in Media 2.0 is (just as it was for Media 1.0) an in-kind investment, putting your money back into the means of distribution. For Media 2.0, that means search, linked content, high-value content, and publishing fresh content more often.

There is another important distinction between Media 1.0 and Media 2.0, regarding the pricing of content. The Web forces a hyper-deflation of price as supply explodes. When I took over as Editor-in-Chief and Publisher of A/E/C Automation Newsletter in 2002, it had been an expensive monthly executive newsletter for more than 25 years, but in recent years it had been losing subscribers at an increasing rate. By 2002, all of its competition was on the web, available to read for free. Never mind that our content was better; free was winning. So as quickly as possible, I turned the print newsletter first into a subscriber-only website, then into a free, advertising-supported website renamed AECnews.com. I now number my readers in the thousands instead of the hundreds, and I am paid by my advertisers, not my readers.

Haque argues that the solution to hyper-deflation of price is to deflate the amount of content to match the deflation of price. That means turning content into micro-content. Several prominent bloggers have proven that several small items a day generate more page views (and more ad impressions) than a single longer daily item. I need to take this lesson to heart at AECnews.com. When I "live blogged" a recent software users conference, with several items a day, my page views skyrocketed compared to my normal traffic flow.

Micro-content can take several forms in Media 2.0. For the printed word, it is the blog entry instead of the article. For recorded music, it is the song (as mp3 file), not the album (as CD). For video, it is the clip, not the show. Because the content chunk is so small, consumers are starting to reorganize their micro-content to suit their purposes. Apple recognized this earlier than most corporations, and reaped huge Media 2.0 mindshare with their slogan “Rip. Mix. Burn.” As more content becomes micro-content, consumers will want to rip and mix all the forms of media they consume. This is known as aggregation. Successful Media 2.0 organizations need to provide the tools of aggregation as well as the content (as does Apple with the iPod and iTunes). Content providers who want to move into the Media 2.0 era need to invest in the tools of micro-content distribution: RSS feeds, search engines, Ajax-style web-based applets, MP3 distribution services. Some of the products we need for Media 2.0 haven’t been invented yet.
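Of the micro-content distribution tools listed above, RSS is the most concrete: each micro-chunk becomes a self-contained item in a machine-readable feed that aggregators can rebundle however the reader wants. A minimal sketch of what a feed looks like (the titles, dates, and URLs here are invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>AECnews.com</title>
    <link>http://www.aecnews.com/</link>
    <description>News and analysis for design and construction technology</description>
    <!-- each item is one micro-chunk of content, ready for aggregation -->
    <item>
      <title>Live from the users conference, day two</title>
      <link>http://www.aecnews.com/archives/example-item.html</link>
      <pubDate>Tue, 25 Oct 2005 09:00:00 GMT</pubDate>
      <description>A short item an aggregator can pull without the rest of the site.</description>
    </item>
  </channel>
</rss>
```

The point is that the unit of distribution is the item, not the page: a smart aggregator never needs to see the publication's home page at all.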

Haque argues that while aggregation is the inevitable result of virtually free micro-content, aggregation can become a value destroyer. What is needed is smart aggregation—tools for more efficient use of resources. Smart aggregation will organize (on an individual level) content plus information, expectation, and preferences about content. Smart aggregators, Haque says, will not only rebundle content, they will also rebundle information about the content (metadata), information about the network, information about the applications in use, and information about the device running all the above.

I still have several pages of notes from my reading of Haque’s ideas, but this article has already violated a key concept of Media 2.0—create micro-chunks of content. I will continue with another entry soon. Let me leave you with this:

Media 1.0 thinking: Blog item = article.
Media 1.5 thinking: Blog item = item + comments.
Media 2.0 thinking: Blog item = item + comments + links.
Media 2.5 thinking: Blog item = item + comments + links + citation + trackback + tags + smart aggregation.

Lessons from the Mess that is FEMA

In my job as Editor-in-Chief of AECnews.com, I daily receive 10-20 press releases that have absolutely nothing to do with technology for design and construction. Usually I scan the headline and press delete. But the ideas in the press release below caught my attention, so I'm sharing it here.


Listening at Warp Speed for Organizational Adaptation:

The Answer is Blowing in the Wind after Katrina

Under the newly formed Department of Homeland Security, FEMA’s entire workplace was revamped and new leadership was installed at the highest levels, without input from those veterans who had worked at FEMA for decades. But rapid change without listening to the workforce will add insult to injury, according to experts like Dr. Elaine Gagne, an award-winning organizational development specialist. She and other professionals who study organizational change cite the breakdown of emergency relief programs in Katrina’s aftermath as a textbook example of what happens when change is not coupled with thorough organizational alignment and workplace communication.

Veteran employees of FEMA have warned for years that the organization suffers from poor management, squandered workforce talent, and attrition of experienced personnel or “brain drain.” But those warnings appear not to have been integrated into the organization re-design.

In truth, how many workplaces could handle their own “Katrina?” According to organizational systems consultants and experts in the field of corporate management, when organizations cannot adapt to rapid change, they fail.

But how many organizations will heed the lessons of Katrina? Are the people, systems, and processes strong enough to bear the “perfect storm?” Does leadership factor in the worst case scenarios in addition to the best case scenarios? Is the organization transparent enough so that each part knows the relevance of other parts and how they depend on each other? Would they pull together or apart in a time of crisis?

These and other questions are essential for organizational leaders today. And, according to Dr. Gagne, the answers lie in engaging the entire organization in becoming adaptable to a changing environment whether it be a slow evolution or one with the speed and power of Katrina. Dr. Gagne says, “The workforce today is not merely about machines, brawn, or even brain. Successful organizations also engage the heart elements like creativity, passion, commitment, loyalty, and morale. Leadership cannot survive without these elements.”

Workforce morale is an issue in organizational success. Employment data from the Bureau of Labor Statistics shows that employees are less satisfied with their jobs and their wages. Low morale costs companies their competitive advantage, it blocks creative product research and design ideas, and it translates into more expensive, slower production. However, corporate winners today are faster, more efficient, and more agile.

A recent National Business Research Institute study confirms Dr. Gagne’s premises for workforce involvement. NBRI found that the root cause of low job satisfaction was that employees were disconnected from the organization’s short- and long-term goals, vision, and mission, and it recommends communicating plans throughout the organization and reinforcing those plans in daily activities.

Dr. Gagne addresses these issues in her acclaimed book Engage! Roadmap for Workforce-Driven Change in a Warp-Speed World, where she describes that successful change involves input from the entire workforce, coupled with a proven roadmap, a unique approach to organizational vision, and an execution plan that has lock-step accountability. According to Dr. Gagne, successful changes in organizations like FEMA must involve everyone in the workforce working together to share ideas, set goals, and, most importantly, implement them as a team.

Although that may seem like a nearly impossible logistical task for a gigantic organization set in its bureaucratic ways, it is the same principle that applies to responding to a large natural disaster. For disaster relief to succeed, it has to be implemented within a system that can adapt to rapid changes and take advantage of clear and accurate internal communication. Scores of people have to be able to communicate throughout the chain of command, without stepping on each other’s messages, and then be supported by the entire organization so that they can act with confidence.

Dr. Gagne believes it is possible to fix what is wrong with FEMA. The silver lining in this cloud of catastrophe is that in the process of fixing FEMA, we will also learn new and improved ways to respond to other organizational problems that plague us in a rapidly evolving world. And that includes overnight problems brought on by unforeseen disasters like hurricanes and terrorist attacks.

T. Elaine Gagne is President and founder of Renaissance Services, Inc. and CEO of Insight Systems Consulting.

For more information, please see www.engagechange.com or www.is-consulting.com

Monday, October 24, 2005

Halloween: A Christian Holiday to Mock Satan

When I was a small child in the late 1950s and 1960s, Trick or Treat on Halloween was an innocent custom with little resistance from the Christian community. Starting in the 1980s, however, many conservative Christians began to teach against Halloween. They claimed that to participate in the customs of Halloween was to participate in a pagan festival, or to give glory to evil and the occult. A few incidents of candy tampering didn't help, and many parents stopped allowing their children to go out for Trick or Treat on Halloween night.

On one level, I never bought into that line of thinking. After all, I reasoned, I participated in Trick or Treat every year as a child, and I certainly didn't grow up a pawn of Satan. But, when I became a father in 1984 when I married Teresa (who had two children from her previous marriage) we decided to "offer" our children alternatives to Halloween. It seemed the right thing to do.

This continued for several years, until one year when we had five of our eventual nine children, we were overnight guests of my (recently departed) Aunt June in Spokane. It was Halloween. Darkness fell, and children from all over the neighborhood innocently started knocking on doors. Teresa and I looked at each other with one of those "what should we do?" looks. I said, "You know, I don't know why we are so against Trick or Treat." I never had a witness in my heart that we had avoided evil or given glory to God by abandoning Halloween and Trick or Treat. She agreed. So we quickly improvised Halloween costumes for the kids (two were in strollers), and Teresa walked them around the neighborhood. Since then, we have allowed our children to enjoy Trick or Treat.

I bring this up not only because it is almost that time of year again, but also because of an excellent short article on the origins of Halloween. "Halloween: A Discernment Exercise" is written by a Christian teacher I hold in high esteem, James B. Jordan. His book, "Through New Eyes," is a must-read for the Christian seeking to move beyond modern Christian pop culture and pop theology. In the article, Jordan cites the historical record to say that Halloween, as the opening of All Saints Day, was a time to celebrate Christ's victory over the forces of darkness. Halloween evolved as a way to mock Satan. A quote:

What is the means by which the demonic realm is vanquished? In a word: mockery. Satan’s great sin (and our great sin) is pride. Thus, to drive Satan from us, we ridicule him. This is why the custom arose of portraying Satan in a ridiculous red suit with horns and a tail. Nobody thinks the devil really looks like this; the Bible teaches that he is the fallen Arch-Cherub. Rather, the idea is to ridicule him because he has lost the battle with Jesus and he no longer has power over us.

(The tradition of mocking Satan and defeating him through joy and laughter plays a large role in Ray Bradbury’s classic novel, Something Wicked This Way Comes, which is a Halloween novel.)

The gargoyles that were placed on the churches of old had the same meaning. They symbolized the Church ridiculing the enemy—they stick out their tongues and make faces at those who would assault the Church. Gargoyles are not demonic; they are believers ridiculing the defeated demonic army.

Thus, the defeat of evil and of demonic powers is associated with Halloween. For this reason, Martin Luther posted his 95 challenges to the wicked practices of the Church on the door of the Wittenberg chapel on Halloween. He picked his day with care, and ever since, Halloween has also been Reformation Day.

I recommend the whole article to parents who have struggled with this issue. I find, in general, that many Christians give far too much honor to the forces of darkness by cowering in fear. In Christ, the victory has been won. Brethren, let's celebrate!

Tuesday, October 18, 2005

Transparent Aluminum No Longer a Star Trek Fantasy

In Star Trek IV: The Voyage Home, Commander Scott traded the formula for transparent aluminum to a scientist on 20th Century Earth in exchange for the first few sheets to be produced. While I doubt the U.S. Air Force has time travelers as consultants, it is working on what can realistically be called transparent aluminum.

In a press release issued yesterday, the U.S. Air Force Research Laboratory at Wright Patterson Air Force Base in Ohio explained the current state of research and development of aluminum oxynitride (ALON) as “a new kind of transparent armor, stronger and lighter than traditional materials” that would be an improvement on existing multi-layered armored glass now in use.

“ALON is a ceramic compound with a high compressive strength and durability. When polished, it is the premier transparent armor for use in armored vehicles,” said 1st Lt. Joseph La Monica, transparent armor sub-direction lead. "The substance itself is light years ahead of glass," he said, adding that it offers "higher performance and lighter weight."

Complete details: Air Force Testing New Transparent Armor

Thursday, October 13, 2005

Meebo: A New IM Service in the Right Place at the Right Time

Meebo.com, a new web-based Instant Messaging service, seems to have sprung up in the right place at the right time. Only four weeks old, it has caught the attention of such heavy hitters as the Web 2.0 Conference, CNet, Om Malik’s Broadband Blog, MSNBC, the Wall Street Journal, and Mark Jen’s Plaxoed blog.

My first use of the service, still considered an Alpha (early test phase) release by the founders, was good. I could see all my MSN “buddies” just as if I had logged into MSN’s client software, and had a good conversation with an associate.

Because Meebo.com is a web-based instant messaging solution, it allows the user to log in to all the major IM services from a browser instead of having client software loaded on the user’s computer. This is a real boon to travelers, those forbidden to install an IM client on a work computer, or those who must share a computer (think Internet café or college computer lab).

Meebo.com is in the right place at the right time because of the recent announcement by MSN and Yahoo that they would be integrating their IM networks. Today the rumor mill is buzzing that Google and Comcast may jointly try to acquire America Online, another major IM player. Each of the major IM players operates a closed network. You can’t be signed into Microsoft’s MSN IM and chat with someone on ICQ, for example. Google Talk, based on open source technology, is an exception, but its ability to communicate with other IM networks is limited. So, in an emerging technology space in which closed networks are trying to fill the open space between each other, along comes an upstart with a solution. Either the three people behind Meebo.com will take venture capital very soon, or they will be acquired by one of the major players.

Users log on to their favorite IM services at http://www.meebo.com; screen names from all IM services that users have accounts with are retrieved and displayed in one organized window within a single browser page. Current IM services (including AOL Instant Messenger, Yahoo Messenger, Google Talk/Jabber, MSN Messenger, and ICQ) have more than 500 million users worldwide. Meebo allows the user to sign in to use any of these services without downloading software to install on the local computer.

The service was written using Ajax technology, the Web-based software development philosophy that is driving many new applications and services. Ajax makes it easier for developers to offer highly interactive, dynamic services within a Web browser page.
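The heart of the Ajax approach is requesting data in the background and updating the page in place, with no full reload. A minimal sketch of that pattern as it looked in this era, using the browser's XMLHttpRequest object (the URL and function name here are invented for illustration, not taken from Meebo's actual code):

```typescript
// Minimal Ajax sketch: fetch data asynchronously, then hand the response
// to a callback that can update the page without reloading it.
declare const XMLHttpRequest: any; // provided by the browser

function fetchBuddyList(url: string, onUpdate: (text: string) => void): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", url, true); // true = asynchronous; the page stays responsive
  xhr.onreadystatechange = () => {
    // readyState 4 = request complete; status 200 = HTTP OK
    if (xhr.readyState === 4 && xhr.status === 200) {
      onUpdate(xhr.responseText); // e.g., redraw the buddy list in the page
    }
  };
  xhr.send();
}
```

Everything a desktop IM client does with a persistent connection, an Ajax application approximates with many small background requests like this one, which is why the whole service can live inside a single browser page.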

Based in Palo Alto, CA, Meebo was founded by female engineering duo Elaine Wherry and Sandy Jen, two Stanford University-trained engineers with degrees in symbolic systems and computer science respectively. A third co-founder, Seth Sternberg, a former IBM mergers-and-acquisitions lead and Yale graduate, is currently a student at Stanford Graduate School of Business.

Saturday, October 08, 2005

Google Already Has A Browser

[An expanded version of this item, written for design professionals, will be posted at AECnews.com.]

The web, and especially the blogosphere, is filled with rumor and gossip claiming that Google is working on a browser. A recent CNet article, "Clues May Point to Google Browser" is probably the best summary of all the current talk. There's one problem with all the speculation--Google already has a browser. It's called Google Earth.

"The media doesn't get it," said Google Earth software engineer Michael Ashbridge Friday at 3D Base Camp, the users conference sponsored by @Last Software, makers of SketchUp design software. “Nobody in the media who writes about Google Earth gets it. Google Earth is a 3D browser.”

Ashbridge, along with Google colleague Wes Thierry, was at the conference in Boulder, Colorado to show off a new SketchUp plug-in that allows SketchUp 3D models to appear in Google Earth, and Google Earth georeferenced imagery to appear in a SketchUp model. I attended the event, blogging it for my online journal AECnews.com.

If you haven't already downloaded Google Earth and played with it, you are missing a treat. (Don't bother if your Internet connection is dial-up.) Having the power to view satellite imagery and aerial photography of the entire Earth, at your fingertips, is an incredible experience. But Google Earth is more than a viewer. It has much of the functionality of the existing map-server sites such as Yahoo! Maps, MapQuest and Google Maps. It also has tools to export data and imagery and to import information from other sources, creating new opportunities for leveraging the geographic data.

The combination of Google Earth and SketchUp opens the door to many new uses. Want to know what the planned parking garage will look like on site? If you create a model in SketchUp, you can place it, at true scale and true location, into Google Earth. Some landscape architects already use SketchUp to create accurate depictions of their work; now they can lay their designs into imagery, merging the new park (etc.) into the existing locale.
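Under the hood, Google Earth reads files in KML, an XML format, and placing a model "at true scale and true location" amounts to writing down real-world coordinates for it. A rough sketch of the idea (the coordinates and file name are invented, and the exact element names come from a later KML revision than the one shipping in 2005, so treat this as illustration only):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://earth.google.com/kml/2.1">
  <Placemark>
    <name>Planned parking garage</name>
    <Model>
      <!-- real-world anchor point for the model -->
      <Location>
        <longitude>-105.2705</longitude>
        <latitude>40.0150</latitude>
        <altitude>0</altitude>
      </Location>
      <!-- reference to the exported 3D geometry -->
      <Link>
        <href>parking-garage.dae</href>
      </Link>
    </Model>
  </Placemark>
</kml>
```

Because the placement is just data, the same model file can be shared and will land in the same spot on anyone's copy of Google Earth.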

The new SketchUp plug-in which allows this interaction between the two programs is about a week away from release. A preliminary version is currently available for download from the SketchUp website, but it is more labor intensive and less useful than the new version.

Compared to the best-known programs for architectural design, including AutoCAD, MicroStation, and ArchiCAD, SketchUp is an upstart. It was launched in 1999, designed to be simple to use but deceptively powerful. While it succeeds in bringing 3D to the masses, it now has a new and huge advantage over the larger, more expensive and more complicated professional CAD programs: it is the only one Google Earth supports. When asked, late in the session at 3D Base Camp, if the ability to place CAD models in Google Earth would extend to other products and formats, Ashbridge replied in his soft Irish lilt, “We really like these SketchUp guys, it’s great stuff. I’m sticking with these guys.”

Thursday, October 06, 2005

Blogging Live from 3D Base Camp

Over at my online journal AECnews.com, I’m filing regular reports, blog style, from 3D Base Camp, the first users conference for SketchUp, the fast-rising, easy-to-use 3D design software from @Last Software of Boulder, Colorado. I’m writing in-between sessions, and posting photos from the conference to a separate 3D Base Camp photo gallery.

Boulder is an interesting small city. Home to the University of Colorado, it is a little bit of Soho, Berkeley and Seattle’s U-District combined, with a distinctive Rocky Mountain flair. Tonight’s party, for example, will be at a place called House of BlueGrass. Last night I wandered around a bit and found a coffee shop that had a soccer match (from the US professional league) on the big-screen TV, and served tea brewed in a pot to order (take a lesson, Starbucks). The Onion, a self-described “farcical newspaper featuring world, national and community news,” is available free from news stands. The two founders of @Last Software dress for this conference as they do every day they go to work, in faded jeans and a T-shirt. When a PR consultant was hired to help with conference planning, she asked what the dress code would be. “Don’t go there” was the serious reply.

Creative Commons License
This work is licensed under a Creative Commons License.