By Max Kalehoff
Source: Online Spin
Economic cycles are a fact of life in the advertising business. It's been true since the beginning. But perhaps more predictable than advertising cycles are advertising forecasts. Hey, there goes another one!
The latest significant advertising forecast comes from media agency ZenithOptimedia. According to Ad Age's coverage of the report, marketers will spend $195 billion on North American advertising next year, 4.1% more than in 2007, while worldwide ad spending will near $486 billion in 2008 for a 6.7% gain. However, U.S. ad-spending growth this year will reach just 2.5%, far below the 3.7% the agency forecast last summer and well under the 3.6% inflation recorded so far this year. That means the growth of the entire pie is happening elsewhere, particularly in less developed advertising economies.
But that 6.7% growth for 2008 is not so significant when you consider that three of advertising's most important catalysts collide that year: the U.S. presidential election, the Olympics and a European soccer championship. Once those events are over, you can bet all eyes will be on early 2009, seen in many industry circles as the likely point at which the inevitable advertising cycle begins its downward trend.
Not surprisingly, the Internet is seen as the bright light in this ultimately mediocre forecast. While newspapers, magazines and radio advertising are expected to lose share, "Worldwide Internet ad spending will climb to $44.6 billion from about $36 billion, increasing its share of the market to 9.4% from 8.1%," according to the ZenithOptimedia report.
What's my take on all this? Selfishly, I'm glad I'm positioned in the online subset of the advertising economy. I've been bullish in this area for the past thirteen years. I suffered through the dot-bomb crash like everyone else, but the fundamentals won out and are stronger than ever. But I'm really not an Internet cheerleader or industry groupie; there's something bigger going on.
It's critical to view online advertising not as a safe haven, nor as an advertising subset with solid fundamentals. Online advertising must be viewed as transformative. Sure, online advertising is a bright spot, but it's really a pervasive entrée to digital media ubiquity. It means all of our silo-plagued concepts of media categories -- like Internet, newspaper, television, stuffed into this latest ZenithOptimedia report -- will eventually connect and blur onto the digital grid. That will lead us to adopt models based less on media formats and audience proxies, and more on actual people, content, behaviors, receptivity, relationships and performance.
As that happens, the advertising economy will achieve unprecedented predictive, delivery, and interactive efficiencies. In other words, when most everything is online and connected, the fat and waste will begin to shed, making shrinkage of the total advertising pie a likely scenario. Even the early-stage, overseas advertising markets experiencing the fastest growth will inevitably mature and face this reality.
Oh, about those inevitable advertising cycles? Successive downturns will facilitate long-term advertising contraction by shifting dollars and expectations to those more accountable digital media. Advertising cycles will alter as a greater proportion of marketing dollars circulate back into product, experience, and customer-management functions.
Friday, December 07, 2007
By Max Kalehoff
Source: MIT Technology Review
NEW YORK (AP) -- Based on the weather reports and restaurant listings you check out online, Yahoo Inc. has a good idea where you live. Based on searches you have done, the Web portal might also know where you want to go.
Do not be surprised then to suddenly see an advertisement on flight deals between those two places. It is what United Airlines did with an ad on Yahoo earlier this year as people browsed for something completely unrelated to travel.
Elsewhere, online hangout Facebook is mining friends' buying habits, and major Internet portals have bought companies to expand their reach and capabilities for ''behavioral targeting'' -- all so advertisers can try to hit you with what they believe you're most likely to buy, even as doing so means amassing more data on you.
''When you are online today, you've been labeled and tagged as this type of consumer in milliseconds,'' said Jeff Chester, executive director of the Center for Digital Democracy. ''All of a sudden you are exposed to a vast number of invisible salespeople who are peering over your personal details to figure out the best way to sell to you.''
Behavioral targeting, commonly accomplished by depositing tiny data files on personal computers to keep track of surfing patterns, has raised privacy questions and, at least in the case of Facebook, user complaints.
From the perspective of Web sites and advertisers, though, behavioral targeting can bring to the rest of the Internet some of the relevancy Google Inc. and others successfully mined for billions of dollars with text-based search ads.
''Everyone's trying to find the next, best mousetrap to compete with search,'' said Adam Broitman, director of emerging and creative strategies with ad agency Morpheus Media LLC.
Although behavioral targeting is not right for all advertisers, it has become increasingly important as companies try to break through the clutter. The research company eMarketer projects that spending on behavioral targeting will nearly double to $1 billion (euro680 million) next year and hit $3.8 billion (euro2.57 billion) by 2011.
Targeting has been around as long as there has been advertising. Automakers promote themselves in car magazines. Political campaigns send mail to likely voters. As advertising moved to the Internet, ads for diapers clung to parenting sites.
With search came contextual targeting -- the ability to target messages even more precisely based on search terms or keywords appearing in articles. Behavioral targeting brings capabilities to sites without good or reliable keywords -- for example, a social-networking profile that touches on dozens of hobbies and interests at once.
Read enough golf articles online and a data file will be put on your computer labeling you a golf fan. When you're on a Web site on cooking, don't be surprised if ads for golf clubs follow you there.
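The golf example describes the core mechanic: tally page views by content category, attach an interest label once views cross a threshold, and let those labels drive ad selection even on unrelated pages. The sketch below is a toy illustration of that idea; the class name, threshold and categories are invented for this example and are not any ad network's actual code.

```python
from collections import Counter

# Toy behavioral-targeting sketch: the class, threshold, and categories
# are invented for illustration and are not any ad network's actual code.
class BrowsingProfile:
    def __init__(self, threshold=3):
        self.page_views = Counter()  # category -> pages viewed
        self.threshold = threshold   # views needed before a label sticks

    def record_visit(self, category):
        """Record one page view tagged with a content category."""
        self.page_views[category] += 1

    def interest_labels(self):
        """The labels a tracking cookie would carry, e.g. ['golf']."""
        return [c for c, n in self.page_views.items() if n >= self.threshold]

def pick_ad(profile, inventory, fallback):
    """Serve an ad matching any interest label, whatever the current page is."""
    for label in profile.interest_labels():
        if label in inventory:
            return inventory[label]
    return fallback

profile = BrowsingProfile()
for _ in range(4):
    profile.record_visit("golf")  # four golf articles crosses the threshold
profile.record_visit("cooking")   # a single cooking page does not

# Even while the user browses a cooking site, the "golf" label drives selection.
ad = pick_ad(profile, {"golf": "Golf clubs sale!"}, "Generic ad")
```

The point of the sketch is that ad choice depends on the accumulated profile, not the page being viewed, which is exactly what distinguishes behavioral from contextual targeting.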
The concept has been around for years, but enough Web sites are now participating that advertisers can still reach a sizable group even if they target narrowly.
They can also target more intelligently, using surfing patterns across a broader set of sites -- or, in Yahoo's case, letting United Airlines customize ads by inserting city pairs and prices specific to the individual user.
Consumers are having trouble understanding all that Web sites are up to, said Ari Schwartz, deputy director of the Center for Democracy and Technology.
The Federal Trade Commission recently held hearings at which consumer-protection and privacy groups, including Schwartz's, called for the creation of a ''do not track'' list.
The Web portals, particularly Time Warner Inc.'s AOL, are stepping up their educational efforts in response to privacy concerns, trying to sell Internet users on the idea that if they are to see advertising to support free services, a targeted, relevant ad is far less annoying.
They also stress that they are not capturing sensitive information like names and e-mail addresses, and in many cases consumers can take steps to decline targeted ads.
Indeed, companies are not going as far as they could.
''At the end of the day, if behavioral targeting is being used and consumers get annoyed, they are going to take it out on the advertiser or the publisher that placed the ad,'' said Michael Cassidy, chief executive of Undertone Networks, which contracts with a network of third-party sites to run ads.
Most Web sites and marketers have been shunning the ultimate targeting -- ads that greet you by name.
Yahoo could easily do that using registration information, but ''I'm not sure people would like that or not,'' said Richard Frankel, Yahoo's senior director of product marketing.
Many have declined to sell ads based on diseases you've read about.
''We could track them and target ads for sensitive health conditions and get lots of money from pharmaceutical companies for that, but there are certain things we've chosen not to do,'' said Dave Morgan, a senior AOL advertising executive who founded Tacoda, a behavioral-targeting company AOL bought in September.
''Sensitive'' includes all targeting related to HIV and cancer. Beyond that, health ads are reviewed on a case-by-case basis, Morgan said.
The Network Advertising Initiative, an industry trade group that counts Yahoo along with AOL and Microsoft subsidiaries as members, began a study about seven months ago to clarify the boundaries.
Carl Fremont, whose advertising agency Digitas handles many pharmaceutical campaigns, said several ideas have been shot down by lawyers at drug companies. He would not name any, and three major drug companies did not respond to requests for comment.
Fremont, the agency's executive vice president, said he sees opportunities in reminding known patients to take their medication or perhaps try a competing drug.
Despite the promises, targeting isn't always appropriate for Web sites and advertisers.
Although sites can charge more per targeted ad, fewer readers see any given one, resulting in less revenue in some cases, said Brian Quinn, vice president of ad sales for The Wall Street Journal's Web site and three other Dow Jones & Co. properties.
''We just end up selling the frosting and not the whole cake,'' he said.
And while targeting may work for cars and travel, it's more difficult to gauge a person's interest in soda or satellite TV service. In such cases advertisers often prefer mass amounts of cheap, untargeted spots, said Sarah Fay, chief executive of ad agency Carat.
Advertisers and Web sites also have to figure out how far they can push without alienating their users.
Many Facebook users, initially unaware that tracking was occurring, have complained that the site went too far in generating endorsements based on friends' online activities. Last Thursday, Facebook agreed to offer better notification and controls, and the company is counting on users to feel more comfortable about the tracking over time.
Users' comfort with data profiling has indeed shifted over the years.
Google faced criticism when it introduced an e-mail service that paired ads with the words inside private messages. Millions of people now use Gmail with scarcely a blink.
Users will eventually embrace the latest tactics, too -- and by then, they'll complain about even deeper levels of intimacy yet to be invented, said Tracy Ryan, professor of advertising research at Virginia Commonwealth University.
''You want to have enough targeting that a consumer notices the message and pays attention, but you don't want it to be so obvious that they are thinking (there) is targeting,'' she said. ''That would be scary.''
Adobe's AIR allows applications to bridge the Web and desktop.
By Erica Naone
Source: MIT Technology Review
More and more applications that used to be tied to the desktop are now available on the Web; a well-known example is Google Docs, the search giant's online word-processing program. (See "Google's Cloud Looms Large.") But as people become used to running applications and storing their data online, Adobe is working on technology that will bring those applications back to the desktop again, transformed by their time in the "cloud." Called Adobe Integrated Runtime (AIR), the technology is intended to help developers take advantage of the strengths of both Web-based applications and traditional desktop applications.
Adrian Ludwig, group product market manager for the technology, explains that AIR applications can run whether the user is connected to the Internet or not. Like desktop applications, AIR applications will be able to store information locally on the user's computer. Users can also access data from the desktop: they can easily drag and drop information into and out of an AIR application. When the user is online, the applications can function as Web-based programs do, giving instant access to up-to-date information, regardless of what computer is used to retrieve it. When the user is offline, the AIR applications continue to run, waiting to synchronize data until the user connects again.
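The offline/online behavior Ludwig describes is essentially the offline-first pattern: always write locally, queue edits while disconnected, and replay the queue when connectivity returns. Here is a minimal sketch of that pattern; the class and method names are illustrative and this is not Adobe's actual API.

```python
# Sketch of the offline-first pattern described above: writes always land
# locally, edits made offline are queued, and the queue is replayed when
# connectivity returns. Names are illustrative; this is not Adobe's API.
class OfflineFirstStore:
    def __init__(self, server):
        self.server = server  # dict standing in for the remote service
        self.local = {}       # cache on the user's own machine
        self.pending = []     # edits made while disconnected
        self.online = False

    def write(self, key, value):
        self.local[key] = value                # the app keeps working offline
        if self.online:
            self.server[key] = value           # online: sync immediately
        else:
            self.pending.append((key, value))  # offline: queue for later

    def reconnect(self):
        """Replay queued edits, in order, once the connection is back."""
        self.online = True
        for key, value in self.pending:
            self.server[key] = value
        self.pending.clear()

server = {}
app = OfflineFirstStore(server)
app.write("draft", "v1")  # offline: stored locally and queued
app.reconnect()           # back online: the queue drains to the server
app.write("draft", "v2")  # online: synced immediately
```

The design choice worth noting is that the local copy is always authoritative for the user's session, so the application never blocks on the network; synchronization is a background concern.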
"We're blending together the desktop and the Web application," Ludwig says.
Users don't have to be particularly aware of AIR, he says. Like Adobe's ubiquitous Flash player, AIR is a plug-in that users download the first time they want to use an AIR application. All subsequent AIR applications can be downloaded without any additional fuss. And like Web pages, AIR applications are designed to work regardless of what type of computer a user has; the programs will work on computers running either Windows or Mac OS. (Adobe is also working on a version of AIR for Linux.)
Several companies have already built applications using AIR, including eBay, Nasdaq, and fashion retailer Anthropologie. These early applications illustrate a variety of approaches to using AIR. The eBay application, for example, is designed for power users who want access to the auction site at all times, Ludwig says. Not designed with offline features, the application uses its integration with the desktop to feed alerts to the user, such as a bouncing icon in the dock on a Macintosh when someone bids in an auction that a user is following.
An application designed by Allurent for Anthropologie, however, focuses on the offline experience. The application downloads the company's catalog onto a user's computer and keeps it updated when online, but it contains offline search features that Ludwig says tend to run faster on the desktop than over the Web. The user can even create offline orders that will be processed the next time the application connects to the Web.
Joshua Rand, CEO of Sapotek, the company that makes the Web-based operating system Desktop Two, says that technologies like AIR are good for cloud computing because they make it simpler for users to keep their data consistent. (See "Computer in the Cloud.") While Web-based services such as Desktop Two allow users to access the same data from multiple machines, Rand says that he thinks it's a significant improvement to be able to work offline, without having to take additional steps to keep the data the same. Sapotek is currently working on a version of Desktop Two that uses AIR.
Jeffrey Hammond, an analyst at Forrester Research, says that if Adobe AIR is widely adopted, he expects it to allow developers to design far more adventurous applications. "The challenge will be making the plug-in ubiquitous," he says, noting that Adobe has largely succeeded in meeting the same challenge with Flash. Although there are products now available that have some similar features, such as Microsoft's Silverlight and Google Gears, Hammond says that he doesn't consider them direct competitors with AIR.
Source: IT Director
David Norfolk, Senior Analyst, Development, Bloor Research
Published: 7th December 2007
Depending on who you talk to, SOA is either the coming thing or a marketing ploy based on an acronym, some technology and little else. In one view, we have organisations running on shared business services, delivering business outcomes and supported by technology that can be insourced, outsourced or upgraded at will, as long as the service contract doesn't change. In the other view, we have standards-based chaos, in which services are built "right" using all the latest acronym-ware—ESB, WS-this, WS-that, Web 2.0—but whether you get the right services, built right, is largely a matter of chance.
As usual, the truth is somewhere in the middle, and where you are on the continuum depends on both your corporate and SOA maturity. The focus should be on building a service-oriented business and then using service-oriented technology, where appropriate, to support it. We are reminded of that other great experiment, the shared corporate database. Yes, in the days before Master Data Management, there really were companies that stored all their data in one place, logically, updated it in one place and made "quality assured" data, and its metadata, available everywhere it was needed. We now need MDM because programmers decided that shared databases were a bad idea because they slowed down coding: for a start, before you changed a field's format or semantics, you had to talk with its other users and apply change management and similar "overhead" processes! So you now meet people building Web 2.0 applications who even suggest that giving everyone their own database is "good practice"—David Heinemeier Hansson, creator of Ruby on Rails, for example, talking to Tim Anderson: "people instead should have one database per application, and if you need to share a database or share data, expose that as services". Many of the issues that killed the shared database for a time could kill shared services:
Politics - a powerful manager can't really "own" a shared service but may want to.
People issues - moving to shared services needs communication and negotiation skills, an area IT is not noted for.
Short termism - developing shared services for reuse needs an initial investment (in the case of databases, a DBA and a data-analysis team; for services, perhaps a centre of competence and an SOA architect). If you only measure project-by-project cost, you will never see the life-cycle ROI gained as a whole project portfolio takes advantage of that initial investment.
The silo mentality - "our project is the only one that really matters and sharing resources will slow us down".
Rewarding short-term technical exploits instead of the long-term delivery of effective business outcomes.
Now is the time to learn from the past and make sure that SOA doesn't suffer from the same dysfunctional issues.
One vendor that seems to be helping companies with this is BEA (we're sure there are others; or, at least, we hope there are). We've just come from a meeting with BEA in which technology was hardly mentioned, let alone BEA products, and the talk was all about governance, architecture, people issues and organisational structures.
We were talking with Danny Healy (EMEA Technical Director at BEA) and Phil Head (UK Practice Director at BEA). They told us that BEA's consulting arm was now 40% architectural (as opposed to 60% technical), including many new hires. The new people are dealing with customers who say "I want to do SOA" (we'd be happier if such people put this somewhat differently; perhaps, "I want to achieve this business outcome and I think SOA is the way to do it"—but we suppose that that's what the BEA consultants address first); and "I need to solve this business problem". And, of course, there are still many who say "I've bought BEA" and still just need technical consultants.
Healy and Head made the point that all of the issues are, in fact, interrelated; and that you have to take a holistic approach to them—but that they don't all have to be solved at the same time. Prioritisation is your friend.
Many of the issues, they say, can be lumped together under one heading: SOA Governance. But different companies will need different governance models, they add, because what you need will depend on:
the internal characteristics of your particular organisation—the organisational and governance status quo, and your corporate and SOA maturity levels (in essence: are you a metrics-focussed organisation without a dysfunctional "blame culture", and are you happy thinking in terms of services and contracts in connection with IT?);
geography and external organisational relationships (managing service levels from people not on site—or perhaps not even employed by you—can be tricky);
the business environment; and so on.
If we ourselves had to pick on one killer issue, we'd pick corporate maturity; and Head and Healy say that evaluating this is often the first thing to do—even prior to engagement.
BEA suggests that a mature organisation will implement end-to-end SOA lifecycle governance—and that many companies that didn't do this first time around are now revisiting the issue. SOA governance is a vital part of producing services with a service contract that people can trust, and the underlying model will include metrics and scorecards that keep SOA alive; the services architecture itself; and the underlying organisational governance of roles and rewards necessary to institutionalise SOA as an active part of the organisation. When introducing SOA, many organisations concentrate on overcoming inertia with respect to change; but continuing efforts to institutionalise a services culture will be necessary to overcome "homeostasis" (the tendency over time for a stable working system to return to its initial state when disturbed).
BEA now has experience of many large companies implementing SOA governance effectively. The specific implementation details aren't usually made public, but the general characteristics of these efforts (from an NDA briefing) are interesting. For a start, there is a definite timeline, delivering the SOA-based organisation incrementally. In the early stages, buy-in at all levels is important, training may be necessary, and the strategic architecture can be formulated. At later stages, the first services can be delivered and evaluated; at the very end, only when many services are available, can you usefully address service portfolio management. It is important to balance local innovation against enterprise co-ordination—and you must make sure that your organisational structures and roles support what you are trying to do. And, overall, you mustn't lose sight of the reusable business services that are the ultimate objective of SOA.
Personalised decisioning is the science of the knowledge economy.
Source: Online Spin
By Dave Morgan
You don't have to have been in the digital media business for a very long time to realize that we are in a period of extraordinary change. Over the past two years, there have been tectonic shifts in everything from enabling technologies to user behavior to business models to regulatory scrutiny. Two years ago, social networks were emerging ideas. Today they are dominating all others in user growth and engagement. Two years ago Google was an exciting, fast-growing search company that was just starting to really crank up a text ad network. Today, it is a dominating digital media company -- with a capital "D" -- with a $16 billion annualized revenue run rate and regulators in the U.S. and abroad scrutinizing many of its moves. Two years ago, a widget was what companies in business school case studies made. Today, consumers interact with Web widgets to the tune of billions a day, and they're starting to show up on mobile devices as well.
What is driving these changes? Here are some of the drivers:
Changing user behaviors. Users are no longer just following forced navigation paths foisted on them by their Internet Service Providers or limiting their browsing to bookmarked "favorites." Instead, they are exploring and searching more and more. They are interacting with and consuming content from millions of new and emergent Web sites. Every day, millions of them create content, share content, critique content. Their browsing patterns are becoming much more about them. This means dramatic shifts in audience fragmentation for Web publishers. It means extraordinary pressure on the branded portals and vertical sites that have dominated user attention for most of the past ten years, but are now losing their hold on their audiences.
Social networks. New foundational technologies, the adoption of much more straightforward user interfaces -- thank you, Google -- the proliferation of portable, object-oriented Web applications, and users' desire for persistent connections with others have all driven the development of dynamic social-network platforms. The recent explosion of MySpace, Facebook and Bebo is just the start. Users are making these networks their own places; they are building profile pages and customized news feeds that are all about them -- not just about what publishers want to push to them.
Ad network proliferation. As audience fragmentation accelerates, the need for advertisers and agencies to aggregate target audiences at scale increases in parallel. Targeted reach and frequency at scale now requires massive ad networks with robust targeting. It is becoming less and less about delivering ads to particular places -- pages, products or programs -- and more about delivering ads to particular groups of people. This will only accelerate as we see more adoption of Internet-Protocol-driven mobile and television platforms, and more user-centric advertising on more digital devices.
What's next? Content orbiting is coming. I think that we are very soon going to see massively scaled content, social and Web service networks spring up much in the same mold as today's online ad networks. They will take over parts of the pages of widely distributed networks of thousands and millions of sites and "orbit" content around users, just as ad networks "orbit" ads around users. We are already seeing Google and other large players get into the content syndication business. We will very soon see the syndication players become "content orbiters," shifting their models from placing content on pages to targeting highly relevant content to users, on whatever pages they happen to be surfing. What do you think?
Thursday, December 06, 2007
Source: IT Director
David Norfolk, Senior Analyst, Development, Bloor Research
Published: 5th December 2007
Every so often you come across a technology that seems genuinely disruptive (in the sense that it makes you rethink what is "good practice"), and Erudine's "behaviour engine" may be a case in point. In essence, it turns the conventional approach to development on its head. Instead of working out what the product should do (in conjunction with end users, if you're lucky), designing test cases for these "requirements" and then building and testing the system, with Erudine you look at the existing business system, whether manual or automated, and collect its inputs and the business outcomes corresponding to them. The Erudine engine then assures you that these inputs will always result in these outcomes and, after a confidence-building period of parallel running, you can cut over to the Erudine engine for production.
So far so good. You have an executable model of the status quo—where's the business return in that? Well, automating a manual system (or developing a new one) brings obvious efficiency benefits, if you've got the requirements right. And the Erudine engine runs on modern hardware platforms, which may let you retire a legacy platform. However, the main return comes from the fact that the Erudine "behaviour engine" model is easily maintainable, whereas your status quo probably isn't (and Erudine would claim its models are generally more maintainable than alternative technologies). If you don't like a behaviour in your current model—perhaps it tells you to politely refuse a customer with a poor credit rating, when you'd like it to initiate a call to the local police station—you simply tell it about the new behaviour and explain why it should now be invoked. For example, something like: "...IF bad customer (IF customer is also carrying a weapon THEN call police ELSE turn away nicely) THEN reject loan...". The new behaviour may well be incompatible with other behaviours, of course—in which case, the contradictions will be brought up and you will explain which behaviour is "correct" in each case and which factors drive the decision. You can take a legacy system whose implicit and/or embedded, undocumented domain knowledge makes it largely unmaintainable and transform it into a system that behaves in the same way but whose behaviour can easily be changed with the application of current domain knowledge at the point of change.
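The loan example can be sketched as a toy behaviour engine: behaviours are (conditions, outcome) pairs, the most specific matching behaviour wins, and a newly added behaviour that contradicts an existing one is flagged so a domain expert can name the deciding factor. Every name below is hypothetical; Erudine's real engine and its mathematical underpinnings are proprietary, so this only illustrates the general idea.

```python
# Toy behaviour engine in the spirit of the loan example above. Every
# name here is hypothetical: Erudine's real engine and its mathematics
# are proprietary, so this only illustrates the general idea.
class BehaviourEngine:
    def __init__(self):
        self.behaviours = []  # list of (conditions dict, outcome)

    def add_behaviour(self, conditions, outcome):
        """Add a behaviour and report existing behaviours it contradicts."""
        contradictions = [
            (cond, out) for cond, out in self.behaviours
            if cond.items() <= conditions.items() and out != outcome
        ]
        self.behaviours.append((conditions, outcome))
        return contradictions  # the domain expert names the deciding factor

    def decide(self, facts):
        """The most specific behaviour matching the facts wins."""
        matches = [(cond, out) for cond, out in self.behaviours
                   if cond.items() <= facts.items()]
        if not matches:
            return None
        return max(matches, key=lambda m: len(m[0]))[1]

engine = BehaviourEngine()
engine.add_behaviour({"bad_credit": True}, "reject loan politely")

# The new, more specific behaviour contradicts the general one; the
# engine flags it, and the "armed" factor resolves the contradiction.
conflicts = engine.add_behaviour(
    {"bad_credit": True, "armed": True}, "call police")

outcome_unarmed = engine.decide({"bad_credit": True, "armed": False})
outcome_armed = engine.decide({"bad_credit": True, "armed": True})
```

Note how the contradiction surfaces at the moment the new behaviour is added, mirroring the article's point that the engine brings conflicts to the expert rather than silently picking one.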
And this doesn't just apply to legacy modernisation. If you want to build a "requirements model" for behaviours you don't have yet, Erudine can implement them; and, if you think about the description above, it is easy to implement the most common behaviours first, so confidence in the model rises rapidly. And your new system will be maintainable, because the Behaviour Engine is, in effect, a complete set of regression tests for the "status quo" system behaviour. If the behaviour changes, it will be because you've told it to and added the factors driving the new behaviour.
The devil is in the detail, of course, and there is a sophisticated underlying mathematical model of behaviour, with automatic consistency checking based on "knowledge management" with "conceptual graphs". Erudine keeps this mathematical underpinning largely private; it's the company's prime intellectual property. If you want to learn more, Erudine can supply (no prescription needed) a trial ‘headache cure’: a package containing a USB capsule with a demo version of the Behaviour Engine, various white papers and training documents, and even a full version of the Engine. It would be unfair to say that this pill guarantees you a good night's sleep—but it does contain plenty of serious reading.
Erudine's secretiveness about the mathematics behind its engine was a stumbling block in its early days, and most of its first customers came to it in acute pain—with a legacy system, typically, that they could neither maintain, nor emulate, nor replace. That pain overcame any risk associated with adopting a solution from a comparatively unknown company, and Erudine demonstrated on several occasions that it could turn failure into success. It could also demonstrate that even if the internal workings of the Engine itself weren't accessible, the end result documented the behaviour of the customer's organisation very clearly and transparently. This was when we first met Erudine, and you can see our initial thoughts here.
The company is now entering stage two of its evolution. It has documented collateral showing that its technology works, reasonable capital investment and a strong partnership with EADS, delivering decision-support systems in the defence and security space. It's no longer just replacing legacy systems but is increasingly being used as an innovative way of developing and maintaining new ones. However, it still has a consultancy arm, actually developing Erudine solutions for its customers—Erudine is new technology, and its customers don't yet feel entirely happy taking it in-house (or perhaps it is thinking in terms of behaviours and knowledge that worries them).
Stage three of the evolution will come when customers feel comfortable taking the Behaviour Engine entirely in-house and working with the concept of behaviour for themselves. Erudine will then be where it wants to be—developing the Behaviour Engine as a routine development tool. But will the Erudine Behaviour Engine become the status quo for software development? Experience suggests, probably not, as new technologies have seldom entirely replaced old ones in IT so far. Possibly only the most mature companies, capable of managing risk and organisational behaviours, will take it up, although if you add in companies with real pain, that's a potentially large market. There may also be an opportunity in emerging economies which have to catch up with the developed economies quickly; and where managers probably have less intellectual ‘baggage’ holding them back and making them risk-averse.
Nevertheless, remember that this isn't a formal evaluation of the Erudine Behaviour Engine; it's a ‘heads up’ for an interesting new technology that is maturing nicely. We are writing about it now because it seems to fit well with another disruptive influence we currently find interesting: ITIL v3, which talks about "integrating IT with the business" and focuses on "business outcomes" instead of requirements. This fits well with the way you develop with Erudine, as the model's behaviour is driven by domain experts in the business, and "behaviour" maps more immediately onto "business outcome" than the average IT requirements document ever does.
The degree of integration of IT into the business that Erudine may catalyse was really brought home to us in a conversation with Phil Rice (CTO at Erudine) about the ethics of decision support in a life-threatening situation—does the Behaviour Engine embody the ethics of the organisation, or does the person at the sharp end make their own ethical decisions? And if they do, should this (or if the decision was bad, with hindsight, its opposite) be fed back into the Behaviour Engine? People often make the wrong decisions in the stress of the moment and, as the world seems to be becoming a more dangerous place, perhaps Erudine will find itself the right technology in the right place over the next few years.
Source: Google Zeitgeist
The 'Fastest Rising' Social Network is... Webkinz
by Rob Garner, Wednesday, December 5, 2007
MARISSA MAYER'S Google Zeitgeist 2007 / Trends announcement yesterday makes a strong case that kids have already inherited the Web. Five social media portals and networks made the list of the top ten fastest-rising search terms. But the fastest-rising social media network isn't Facebook, YouTube, or MySpace. It's a social networking site for beanie-baby-like plush toys, and their respective preschool-to-preteen owners, called Webkinz (it's at no. 2, just behind "iPhone"). If you're not familiar with Webkinz, it's a sort of technically primitive (and safe) Second Life for kids, and the price for entry is buying one of the plushes, then logging in and socializing with other online animals in their own virtual world that also includes their own little rooms and a variety of games.
The Webkinz phenomenon comes as little surprise to me; I spent the better part of the last year getting nudged off the home laptop by my two young children, who consider Webkinz to be as good as, or better than, any toy or kid's show on TV. As a potential sign of things to come, my daughter is particularly adept at taking her little Chihuahua on shopping sprees at the Webkinz store, where her purchases have included a virtual widescreen TV, cowboy boots and a rainbow wig. She also has a buddy list, often sends gifts and correspondence to her little brother, and is thrilled when she gets her own PMs from her dear old dad. My son ("Ribbit the frog") likes to play the games section of Webkinz, and he often has me play along with him. My wife ("Pinkie poodle") has even gotten sucked into various games, and I often catch her "working" to help the kids earn Webkinz Cash by playing some of the online games.
At work, social media is about a whole other list of players, but at home, it's all Webkinz. Based on current valuations and investments into other hot social networks, that must make Webkinz worth about, oh, $20 or $30 billion. Who would have ever thought it?
There was more to Marissa Mayer's announcement yesterday, so without further ado, here is the entire U.S. Top 10 list of fastest-rising search terms in 2007:
6. Club Penguin
10. Anna Nicole Smith
Mayer also took some extra time in the presentation to explain how to use Google Trends. Google Trends has been around for some time now in its current state, so it seems that much of the press conference was to draw mainstream attention to the tool, which shows the history of individual search terms and phrases in Google over time. But she also pointed out some interesting observations to illustrate how it could be used:
- Teachers all around the world are pleased to find that "math" is still more popular than "Paris Hilton" and "Britney Spears."
- Google Trends may be a "political oracle" in that it tracks top-of-mind interest in political candidates, and could potentially predict the outcome of the upcoming elections ("Hillary Clinton" is leading other Democratic candidates at the moment).
- A friend of Mayer's runs a boutique, and uses Google Trends to find out the popularity of various labels of jeans when making a decision on what to stock.
As noted in my previous column, Google Trends is an essential tool for all online marketers. While no exact frequency metrics are available, the data comes straight from the source, and can be particularly revealing about what people are actually thinking about your brand, or how popular it really is.
One exercise I have gone through with some of my own clients is to compare their brand terms against the competition's as a measure of brand popularity and interest in search. The bottom line is that Google Trends doesn't lie. If your brand or company name is behind the competition, then there is some serious branding work to be done -- offline and online.
To get an idea of how to start playing around with Google Trends, here are a few sample comparisons -- click through to see which brands and searches are the most popular in Google over time. You may be surprised:
- Coke vs. Pepsi
- "search engine optimization" vs. "search engine marketing"
- Beyonce vs. Bach
- Websites vs. widgets
If you've never played around with Google Trends, have fun with it. As for me, I'm off to spend some more quality social-networking time with the kids.
Source: Data Strategy Journal
According to Eric Newcomer and Greg Lomow, authors of Understanding SOA with Web Services, service oriented architecture is “an architectural style that guides all aspects of creating and using business processes, packaged as services, throughout their lifecycle, as well as defining and provisioning the IT infrastructure that allows different applications to exchange data and participate in business processes regardless of the operating systems or programming languages underlying those applications.”
Although this definition seems to have a reasonable amount of industry acceptance, a variety of definitions exist, and you may see more than one reflected in the articles presented in this issue of the Data Strategy Journal. Regardless of definition, it seems everyone agrees that SOA is vast. SOA is more than IT or web services or a set of technologies; it is an organizational matrix that is applicable across the board - and it can be confusing.
In the hopes of providing you and your organization with insight into effective SOA implementation, we asked prominent consultants in the world of data services and service oriented architecture to answer a few key questions (via e-mail). Together these participants constitute this month’s Experts' Roundtable. Below you will find our experts and their answers.
MEMBERS OF THIS MONTH’S DSJ EXPERTS' ROUNDTABLE
Jim Bean - TOGAF 8 Certified Architect and author of four books on the topics of data modeling, database design, XML, Web Services, reuse, and Global Standards
Andrew Flower - Founder and lead consultant for Right Triangle Consulting
Ken Karacsony - Author, lecturer, Information Strategist and IT consultant
Joe LaFeir - Vice President of product development & CTO for RLPTechnologies
Doug Stacey - Data Architect in the Information Architecture group at Allstate Insurance Co
D.G.: What's the most common first step for organizations getting started with SOA? Is there a particular type of project that seems to break the ice better than others? (I.e., where is the low-hanging fruit?)
The most important first step is creating the underlying infrastructure and governance to support SOA in your organization. Most people won’t want to hear this because it is difficult to sell, but the business will want to see results and what I am suggesting does not get them there immediately. SOA is not about any one single project, but rather an IT application framework and approach for delivering solutions to the business. Establishing the technical infrastructure and governance is a key first step to support an SOA strategy.
Coupling the infrastructure and governance with one or two small initiatives, such as wrapping a high-demand, stable legacy application as a service or integrating a commercial business application into your environment as a service, is a good way to get the gears turning. I would stay away from major re-engineering efforts as your entry into SOA. If a large initiative is required to get funding, create a small proof of concept early in the project, both to prove the technology in your environment and to give your staff an opportunity to use the technology in a safe setting.
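The "wrapping" idea above can be sketched in a few lines. This is a minimal, hypothetical illustration (the routine and names are invented, not from the roundtable): an adapter exposes a stable legacy routine behind a clean service interface, so consumers depend on the contract rather than on the legacy code.

```python
# Hypothetical sketch: wrap a stable legacy routine as a service.

def legacy_account_lookup(acct_no: int) -> str:
    """Stand-in for an existing, stable legacy routine."""
    return "ACTIVE" if acct_no % 2 == 0 else "CLOSED"

class AccountStatusService:
    """Service facade: the interface is the contract consumers see."""

    def get_status(self, account_number: int) -> dict:
        # Consumers receive a well-defined message, never a legacy call.
        return {"account": account_number,
                "status": legacy_account_lookup(account_number)}

svc = AccountStatusService()
print(svc.get_status(42))  # {'account': 42, 'status': 'ACTIVE'}
```

If the legacy routine is later replaced, only the facade changes; every consumer of the service interface is untouched, which is the point of starting SOA this way.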
The reality of IT is that every IT related activity must demonstrate immediate value or people will question and resist it. The challenges involved in implementing SOA are significant and they will not be accepted unless there is some benefit realized fairly early on in the process. Show the rewards to the enterprise quickly by starting with business objects that have a broad enterprise appeal. Those business concepts and processes that are used by many different business groups are prime candidates for early SOA implementation. Frequently accessed business information means immediate return on an SOA implementation effort.
SOA is an enormous effort – beware being over-zealous in implementing SOA. History has demonstrated that the “big-bang” approach in IT rarely works. Small, incremental changes have a greater opportunity for success because they are more manageable. Fortunately, the incremental approach works well within the context of SOA because the architecture allows the company to implement one service at a time.
From a Data Services perspective there is no harm in starting piecemeal, on a project by project basis. Services to expose enterprise data can be built as needed. The important point when doing so is to take into consideration the ‘enterprise view’ when designing the service interface; trying to anticipate the needs of your broader consumer base rather than designing what I call “point to point” services.
D.G.: To what extent does SOA require a different mindset to succeed? If you could change the mindset of our readership to better prepare them for their first SOA project, how would you do it?
A large part of the integration problem is the result of our approach to rapid application development and short-sighted solutions to business problems. We have learned quite well how to deliver quickly, but often at the cost of redundancy, isolated applications, and an inability to share and reuse our data assets. When all facets of a well-defined SOA are properly applied, we have a method of resolving the integration problem and building an enterprise architecture that is effective, fluid, and articulate.
However, SOA is not just another development method or a set of technologies. It is an architecture that combines critical principles, standards, technical capabilities, and governance. If we retain our current development mindset such that we can continue to build and deploy more application and data solutions faster and without principles, standards, etc., we will perpetuate the integration problem and, over time, we’ll see an escalation in technology costs rather than the desired solution.
SOA is not about any one project it is an approach to delivering IT solutions. The approach delivers increasing value over time as the leverage of integrated services is realized and time to deliver solutions to the business is reduced. Implementing SOA will require a good degree of patience and discipline.
For many companies, SOA is a radical departure from traditional architectures that are based on tightly coupled application interfaces. Consequently, there may be a steep learning curve to understanding SOA. Training and education are absolutely essential to flatten the curve.
A top-down training approach is recommended. First, educate senior management on the fundamental tenets of SOA and the benefits of deploying it. This is critical. If the CIO, for example, is unable to grasp the basic methodologies and goals of the architecture, then he will not be able to support it.
Once you have trained upper management, proceed to lower-level managers. They must not only be educated in the overall goals and philosophy of SOA, but also trained in its practical details and how it will be implemented.
Finally, train your staff on the specifics of building and deploying SOA. This granular level of training needs to address the specific technologies to support the company's move to SOA. This will require the greatest amount of training.
Keep in mind that the initial training may not be a resounding success. The concepts of SOA are foreign to many IT professionals, who are probably more familiar with other architectural models.
Comprehending a new paradigm is often difficult. Futurist Joel Barker refers to this syndrome as the "paradigm effect." He explains that most people have certain boundaries that govern their perceptions of the world. When a new theory tests those boundaries, people may reject it because it doesn't fit in nicely with what they believe.
Conquering the paradigm effect requires commitment from management and a thorough training and education campaign.
Unless you take an enterprise view, rather than a project view, you’ll end up with a confusing mishmash of services with very low reuse and much analysis required down the line to figure out which existing services to reuse. The change in mindset may be one of “service governance,” or data governance, to ensure that the most value is obtained from your SOA efforts. Otherwise, the effort could fail from an inability to demonstrate benefits and be tossed on the pile of past technology failures that held big promise but didn’t deliver.
The payoff is in reuse. If development continues to result in more services being built every time a new request comes along, the promise of SOA will not be realized. Service governance must be used to contain the proliferation of potentially redundant services. This doesn’t happen all by itself. It requires a concerted effort to inventory your services as they are built and to review that inventory before any new service is built to ensure you aren’t overlapping the functionality of an existing service.
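The inventory-before-build check described above can be made concrete. The sketch below is hypothetical (the class and capability names are invented for illustration): a governance step consults the service inventory and flags any proposed service whose capabilities overlap an existing one.

```python
# Hypothetical sketch of a service-governance check: review the inventory
# for overlapping capabilities before any new service is built.

class ServiceInventory:
    def __init__(self):
        self._services: dict[str, set[str]] = {}  # name -> capabilities

    def find_overlaps(self, capabilities: set[str]) -> list[str]:
        """Return names of registered services covering any of these capabilities."""
        return [name for name, caps in self._services.items()
                if caps & capabilities]

    def register(self, name: str, capabilities: set[str]) -> None:
        overlaps = self.find_overlaps(capabilities)
        if overlaps:
            raise ValueError(f"capabilities already covered by: {overlaps}")
        self._services[name] = set(capabilities)

inventory = ServiceInventory()
inventory.register("CustomerLookup", {"get_customer", "get_customer_history"})
# A proposed duplicate is flagged before any code is written:
print(inventory.find_overlaps({"get_customer"}))  # ['CustomerLookup']
```

In practice the "inventory" would be a governed service registry rather than an in-memory dictionary, but the review step is the same: search first, build only if nothing overlaps.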
SOA offers the opportunity to do things we have always wanted to do: develop common data access, integration, quality services that are shareable and reusable. The same data governance principles apply, but now we have to consider function governance. Yes, the data professional has to think about the verbs (function) as well as the nouns. But we’ve been fighting for and pushing data governance and an enterprise perspective in our organizations for some time now, so we are well equipped to broaden our focus to other aspects that facilitate a more efficient enterprise.
D.G.: A big percentage of our readership is focused on data management. What are the key “data” issues that need to be accomplished for SOA success?
There are a number of data challenges that must be recognized and resolved within our SOA. First is the recognition that in the moderate to large enterprise we often have numerous siloed (vertical) data implementations. Each data implementation may have different naming, taxonomy, typing, metadata, and platform implementations. SOA and data services can help to resolve this integration challenge through the loosely coupled, interoperable, and standards-compliant movement, exchange, sharing, and reuse of data. As data practitioners, we need to extend our practice to address these data issues within our SOA strategy.
Second, all SOA collaborations (that is, the interactions between consumers and services) rely upon a well-defined, loosely coupled, interoperable, and standards-compliant interface. That interface definition “is” metadata. WSDL, XML Schemas, SOAP, XML, service interface design, and canonical message models need to become core to the data practitioner’s lexicon and offerings.
The key data issue to deal with in terms of SOA is the concept of data in motion. With SOA the idea is to build composite applications by combining services to satisfy a business need; the loose coupling of services that allows for this flexibility will also require practitioners to pay particular attention to the modeling of the data that flows into and out of the services. Through the use of XML, process automation, and service orchestration technology, data management professionals have the tools to creatively distribute data to services and aggregate data from services.
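A tiny sketch of "data in motion" under assumed names (the source format and canonical element names here are invented, not from the article): an inbound message from one silo is transformed into the canonical message shape agreed for the service interface.

```python
# Hypothetical sketch: map a source system's XML message into a canonical
# customer message that matches the agreed service interface.
import xml.etree.ElementTree as ET

SOURCE_MSG = "<cust><fname>Ada</fname><lname>Lovelace</lname><id>42</id></cust>"

def to_canonical(source_xml: str) -> str:
    src = ET.fromstring(source_xml)
    canonical = ET.Element("Customer")
    # The canonical model renames and combines fields from the silo's layout.
    ET.SubElement(canonical, "CustomerId").text = src.findtext("id")
    ET.SubElement(canonical, "FullName").text = (
        f"{src.findtext('fname')} {src.findtext('lname')}"
    )
    return ET.tostring(canonical, encoding="unicode")

print(to_canonical(SOURCE_MSG))
# <Customer><CustomerId>42</CustomerId><FullName>Ada Lovelace</FullName></Customer>
```

The modeling effort lives in deciding what `Customer` looks like for the whole enterprise; the transformation code itself is the easy part.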
SOA is not a replacement for fundamental data management techniques. Indeed, the benefits of SOA are predicated on intelligent data management. There are some keys to developing a thoughtful data management strategy. These include:
• Minimize data redundancy
• Establish information quality guidelines
• Assign information stewardship
• Develop enterprise data models
• Manage metadata centrally
Just as redundancy in services makes absolutely no sense, neither does redundancy of data. Consolidate redundant databases where possible, and develop a common understanding of the data. This will greatly facilitate your move to SOA by simplifying the development, testing, and deployment of services for accessing the company’s single version of truth.
Data quality must also be addressed. First, it is important to understand the extent of data quality issues in the organization. A data quality effort should be undertaken to determine what data quality problems exist within the enterprise. Understanding these is critical in designing your SOA services, since it may point you towards areas of the business that can be reengineered.
Companies are finally beginning to understand the importance of information stewardship in managing data effectively. In support of SOA, a team of data stewards should be assembled around the major subject areas of the organization (e.g. Finance, Sales, Inventory, Customer, etc.). This will be a highly knowledgeable team of subject matter experts who will assist your information architecture team in achieving a common standard around data structure, meaning, and quality.
It is important to define the data in the organization to support the development of services in the framework of SOA. This is accomplished by developing enterprise-level data models that represent the real world objects in the company. This is a major effort and does not need to be done all at once. You may decide to perform your modeling activities in conjunction with the services that are being designed.
For example, if you are designing a service called “get_customer” you will need to model the customer subject area. There is a caveat, however. Your modeling effort must include representatives from the entire organization who rely on customer information. Remember, services are being designed for the enterprise which means that the data must also be designed – modeled – for the enterprise.
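To make the "get_customer" example concrete, here is a minimal, hypothetical sketch (field names are invented for illustration) of a canonical customer record modeled for the enterprise rather than for one consumer: the single service carries the fields that Finance, Sales, and other groups each rely on.

```python
# Hypothetical sketch: an enterprise get_customer service returning a
# canonical Customer modeled for all consuming business groups.
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class Customer:
    customer_id: int
    full_name: str
    billing_status: str   # required by Finance
    segment: str          # required by Sales

# Stand-in for the enterprise's single version of the truth.
_CUSTOMERS = {42: Customer(42, "Ada Lovelace", "current", "enterprise")}

def get_customer(customer_id: int) -> dict:
    """Service operation: return the canonical customer as a plain message."""
    return asdict(_CUSTOMERS[customer_id])

print(get_customer(42)["full_name"])  # Ada Lovelace
```

If the model were built only for the first project that needed it, the next consumer would either extend it (an interface change) or build a redundant service, which is exactly what enterprise-level modeling is meant to prevent.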
Metadata basically means “data about data.” Services will depend heavily on the definition and understanding of the enterprise data. This information should be centrally managed in a metadata repository. Common business terms and definitions must have consensus in the organization. Achieving this consensus is a task for your information architecture team and your subject area data stewards.
Data standardization and quality are very important. SOA gives us greater flexibility to use data, but that age-old problem creeps back in if the data is a bear to integrate.
D.G.: Our other audience represents the business. What do they need to do to help SOA succeed?
The best advice I can give here is a little patience during the early stages. Rushing the first application or two out will compromise the underlying infrastructure, and that will show up in the flexibility, or lack thereof, down the road. Early projects will be burdened with both the learning curve and likely a large portion of the infrastructure tax. Think of it like investing in the stock market.
SOA is not simply an IT initiative. That is to say that SOA requires strong collaboration between both the business and IT. In SOA, there is an inseparable connection between the business processes and the services that support the processes.
The business must be prepared and ready to improve and standardize the business processes in conjunction with implementing services. All too often, there is redundancy between similar business processes with subtle variations. This makes designing for SOA reuse difficult because it is impractical to develop an architecture based on subtle variation. That is why it is imperative to explore and redesign business processes and remove subtle variations for similar processes.
In addition to partnering with IT, the business also must understand that SOA is not a “quick-fix” and will take time to develop and implement. The CIO must communicate to the other “Cs” of the company that SOA is an investment in future savings. There is an initial cost associated with changing architectures to the services model, but the investment will pay huge dividends in supporting the organization later by establishing a stable and flexible architecture that will enable the company to adapt quickly to changing market conditions.
The business needs to be tolerant of the governance function sometimes dictating a solution that isn’t the absolute quickest way from point A to point B. In the long run they will benefit many times over from decreased maintenance costs and quicker application modification/development to respond to changing business needs. That will not be achieved however with a “cowboy” development approach with no participation from the data management professionals who have the ability to apply some governance over the development of services.
Value your data as much as your functionality. Historically, too much emphasis has been placed on applications that do specific things without considering the enterprise’s need to share data. When building and buying new systems, always consider how well the solution adheres to your enterprise data architecture. An overemphasis on the function (whether it is encapsulated in a service or not) ignores, in my opinion, the more valuable asset to the business: the data.
D.G.: What key skills and disciplines do data management professionals bring to SOA? How can they best leverage this experience into SOA professional success?
The service interface is critical to SOA success. The data practitioner has strong expertise in areas of data architecture, modeling, and scriptable metadata languages (as in DDL, etc.). These same skills need to be extended to include the similar and extended needs of SOA. The data practitioner must become an expert with WSDL, XML Schemas, SOAP, XML, interface design, and canonical message modeling.
The data practitioner is already skilled in the application of specific data models and database implementations. The next step is to augment the current skill set with data mapping and rationalization skills to help resolve data and data source integration challenges. A strong understanding of supporting transformation and translation technologies is a big plus.
Good data management fundamentals are still critical to the success of SOA based projects. Modeling data in motion can be a challenge; however, with solid data management discipline to drive consistency and structure to data, data management professionals can certainly be successful.
Skills such as logical and physical data modeling are absolutely invaluable to implementing SOA. It is important that the IT department work with the business to clearly identify and define all of the information requirements of the business. This is done as part of the conceptual and logical modeling sessions. Once the conceptual and logical modeling activities are complete, it is important to define the physical data models in order to support the development of services.
Data management professionals are often more used to looking at the big picture, to planning for the future while implementing for the present. Insistence on precise semantic meanings for data is second nature for a data management professional, yet so critical to implementing data services in SOA.
D.G.: Any suggestions on how to scope an SOA project from a data perspective? What are the questions that need to be asked?
Because SOA is not about any single project, we need to look at the data in terms of a functional area of the business or the enterprise as a whole. While each project has its unique use or function it will be responsible for delivering, the true leverage of SOA will likely not be achieved if the data perspective of individual projects is too narrow. The data perspective should be broad with consideration for how the business produces and consumes data. Good fundamental data management practices still apply.
From a data perspective, scoping for SOA is no different than scoping for any other IT related project. Activities such as conceptual, logical, and physical modeling are integral to SOA success, so they must be planned for. Additionally, data architecture is very much required -- defining the “As-Is” and “To-Be” architectures that support the SOA initiative.
D.G.: What's the first tool investment a customer should make and why? (Looking for tool "category" rather than a specific vendor recommendation.)
The first tools decision and investment will likely be the use of an Enterprise Service Bus (ESB). If your technology strategy is to use an ESB to manage the integration of services it should be one of your early decisions. When considering an ESB also consider vendor capabilities for process automation. Understanding the requirements of the business is essential to selecting the appropriate technology.
I don’t think there is just one category to start with. I think you have to look at four things equally: Data access and integration services, function/application services, process orchestration services, and a technology platform to deploy these services.
Data access and integration services are probably nearest to the heart for us data professionals. This really amounts to deploying ETL technology, messaging, data standardization, and EII solutions as services.
Function/application services are probably what most people think of first when they hear the acronym SOA. This is the deployment of application functionality as services.
Process orchestration is what makes services useful. Having a pool of services (function and data services) available is great, but if I can’t orchestrate them and re-orchestrate them to execute a business process, then I’m not getting the flexibility SOA promises.
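The orchestrate-and-re-orchestrate idea can be sketched simply. This hypothetical example (the step names are invented) treats a business process as an ordered composition of existing services; changing the process means re-ordering or swapping steps, not rewriting the services.

```python
# Hypothetical sketch: a business process as an ordered composition of services.

def check_credit(order):   return {**order, "credit_ok": True}
def reserve_stock(order):  return {**order, "reserved": True}
def send_invoice(order):   return {**order, "invoiced": True}

def orchestrate(steps, order):
    """Run each service step in sequence, threading the order through."""
    for step in steps:
        order = step(order)
    return order

standard_process = [check_credit, reserve_stock, send_invoice]
print(orchestrate(standard_process, {"id": 7}))
# {'id': 7, 'credit_ok': True, 'reserved': True, 'invoiced': True}

# Re-orchestration: prepaid orders skip the credit check, reusing the
# same services in a different composition.
prepaid_process = [reserve_stock, send_invoice]
```

Real orchestration engines (BPEL and the like in this era) add state, compensation, and long-running transactions, but the flexibility argument is the same: the process definition changes while the service pool stays put.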
Finally, you need an application server and related software to manage and deploy these services where they can be found, as well as execute the services.
D.G.: When SOA projects fail, what are the most common reasons?
There are several easily identifiable causes of SOA failure. The first is not having a solid SOA definition, vision and strategy. If you do not understand what SOA is, how it can benefit your enterprise, and/or what you need to get there, failure will no doubt result.
Another cause for failure is letting your SOA strategy be led entirely by technicians. Technicians are strong partners and contributors to the SOA strategy. They have requisite skills and expertise. However, a successful SOA strategy will be led by business and technical “visionaries” and with strong contributions from a number of technical disciplines.
Another reason for failure is an over-riding emphasis on specific development practices or technology. If the SOA strategy is led by a heavily biased technical development team, failure is likely. SOA is not about any single or specific technology. Rather, it is about development principles, integration, reuse, standards, and governance. The real value of SOA is in resolving the integration and interoperability challenges – not in promoting a single technology or development platform.
When a project becomes too focused on technology, new and emerging technologies and concepts tend to attract highly skilled technologists, who can often dominate the direction of a project. Be sure to offset the strong influence of these technologists with strong project management and business representation.
Failing to plan for SOA is synonymous with “planning to fail,” because SOA is too big to work on without a plan or strategy. An organization needs to fight the urge to deliver quickly and spend the necessary time in the planning phase. This is one of the main goals of your SOA team. You may also need outside assistance from a consulting team to help plan. Time spent in the planning stage will pay huge dividends down the road. Do not try to build SOA too quickly – it is a complex solution to a complex set of problems. Rushing the design and implementation may only make things worse.
Most project failures are the product of not having a clear, well-thought-out strategy and a framework for making decisions that recognizes that what you are doing impacts, and is impacted by, the rest of your enterprise. SOA forces an organization to think more about how their project relates to the whole enterprise. An SOA project can be successful in narrowly defined terms, but if it does not produce the promised flexibility and reusability, then it is a failure.
D.G.: Are there any SOA myths? Can you dispel these?
One of the most common SOA myths is that you can just buy SOA off the shelf. Unfortunately, this myth is often promoted by technology vendors. It is important to recognize that SOA is NOT just a technology solution. Rather, a service oriented architecture (SOA) is a combination of consumers and services that collaborate, are supported by a managed set of capabilities, are guided by principles, and governed by supporting standards.
Another myth is that having an Enterprise Service Bus (ESB) means you’re doing SOA. The reality is that the ESB is nothing more than middleware. I’m sure many people have built applications using an ESB that never realized any of the benefits of SOA, just as many of you have examples of a poorly designed database, especially from the early days. Don’t get me wrong: the ESB is a key enabling technology, just like an RDBMS, but it must be used to implement a well-thought-out architecture.
Perhaps the most prodigious myth regarding SOA is that it is synonymous with technology. A service-oriented architecture is not a technology; it is an architectural framework. An architectural framework is a design. Technology is a product or system that achieves a practical purpose. Technology certainly supports the implementation of a service-oriented architecture, but the technology does not constitute the architectural framework itself.
A service-oriented architecture is very much like a “high level” building architecture. It identifies the enterprise layout of applications and interfaces as well as the over-arching plan for technical architecture, information architecture, application architecture, business architecture, and integration architecture.
The tools, technologies, software, hardware, etc. follow the architecture and are similar to the bill of materials for the building plan. They support the architecture directly, but are not part of the architecture.
The important take away here is that the architecture ALWAYS precedes the physical design. You never want the technology to drive the architecture, which happens all too often in IT.
D.G.: For the skeptical business sponsor, what's your SOA elevator pitch? (Assume you're only going 10 floors.)
IT projects are complicated, fraught with risk, always seem to take longer than expected, and seem to cost more than expected. There is no silver bullet for these problems; however, we can implement approaches and systems to try to minimize them. SOA is a great example of an approach from IT to address these fundamental challenges in delivering IT solutions. The idea of SOA is to break the business problem down into smaller units of work (let's call them services) and to link them together to create solutions for the business. Services allow for reuse and consistent application of business rules. Enabling a new business process by stringing together a new combination of services is a lot easier and takes much less time than writing new applications. Seems pretty simple, and in concept it is. So why has it taken so long? The short answer is that the supporting technology and standards needed to make this approach broadly feasible, across technology platforms and vendors, have been maturing for some time and have come a long way in the last few years. Now we have the tools and accepted standards to make it a reality.
When selling SOA, remember not to "oversell" the concept. The business may be a little wary of yet another IT initiative. It is important to discuss both the benefits and challenges associated with a new architectural framework.
In selling the business on the benefits of SOA, it is important to address the frustrations of the business and demonstrate how SOA will address their frustrations. It may take some work to identify several key pain points facing the company.
If the business is grumbling that they are unable to adapt quickly to emerging business opportunities, demonstrate how SOA will make the business systems more flexible. If the complaint is that IT costs too much and fails to deliver on time and on budget, explain how SOA will help by easing systems integration issues and why developing reusable components will save money in the long run.
Whatever the problems facing the business, it is important to associate the value of SOA directly to the problems. If IT can clearly demonstrate how SOA will benefit the organization, then it will be an easy sell. However, if IT fails to quantify the benefits of SOA, then this is a strong argument against implementing SOA.
A service oriented architecture is going to decouple your business logic from the underlying technology. Once a portion of your business logic is implemented in such an architecture, you will be able to modify that logic to implement changing processes much more quickly than you’ve experienced in the past. No longer will a seemingly simple change from your perspective require technology changes that ripple through several systems and take months to complete.
You really should consider unlocking all that data (which is one of your company’s best assets by the way) by focusing on the enterprise being more efficient. Yes, you’ve already spent a ton of money on lots of different technology solutions, but they solved specific problems without enough concern for the enterprise value of the information they collect. Why don’t we spend a little time and, of course, money evaluating technology that facilitates sharing, reusability, and standards? Isn’t that what most businesses do to reduce cost and be more efficient?
Source: Behavioral Insider
By Phil Leggiere, Wednesday, December 5, 2007
IT'S EASY TO REMEMBER -- but hard to believe -- how esoteric such now-ubiquitous notions as retargeting, behavioral segmentation and profiling seemed only 12 to 24 months ago. In the conversation below, Alistair Goodman, vice-president of strategic marketing at Exponential, outlines a number of key emerging lines of innovation, with radical implications for behavioral and other forms of targeting.
Behavioral Insider: One of the areas you talk about in your 2008 trends predictions is the semantic Web. You say that while this technology has largely lived in the research arena so far, it has finally matured and is now ready to make its way into mainstream applications. How do you see marketers starting to leverage the user data from tools like Twine, Yoono, del.icio.us and StumbleUpon.com?
Alistair Goodman: I'd say of all the top trends for 2008 we identified, the semantic Web is the most 'bleeding edge.' But we think there's a huge opportunity for advertisers to use technology that's already here to increase their knowledge about how consumers actually interact with content. In our case we use technology on content pages to deduce the true topics of pages on levels far more precise than keywords. We've been focused more and more on how to use page topic knowledge to decide in real time not only which products -- but which ads -- will be the most relevant.
BI: You seem optimistic video may become much more targetable based on user attributes.
Goodman: Yes. With video, I think the frustration stems from the complexity of the buy, as marketers have gained much more control over how and where their display ads are deployed. What's lacking now are standards and simplicity in serving video. But one major priority of the coming year will be to extend to video the full gamut of targeting options marketers already have for banners.
BI: Having local advertisers become able to target national sites is another Exponential trend. What's your road map for that?
Goodman: We're thinking that 2008 in many ways will be the year hyper-local targeting becomes feasible. The real opportunity in local targeting is to allow a local advertiser to run an ad on a national site and reach just those visitors on the site from specific zip code areas. Within specific geographical areas you could then segment by behavior. Of course if you're only talking about three or four zip codes, scale will be an issue. But I can see regional advertisers making great use of something like that. What this opens up also is the ability to dynamically change content based on both behavioral and geo segmentation.
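The mechanics Goodman describes, reaching only a national site's visitors from specific zip codes and then segmenting within that pool by behavior, can be sketched as a simple two-stage filter. This is purely illustrative: the zip codes, behavior labels, and visitor records are invented, and no real ad server is implied.

```python
# Illustrative sketch of hyper-local targeting on a national site:
# show a local advertiser's ad only to visitors from target zip codes,
# optionally narrowed by an observed behavior. All data is invented.

TARGET_ZIPS = {"10001", "10002", "10003"}  # hypothetical advertiser footprint

visitors = [
    {"zip": "10001", "behavior": "auto-intender"},
    {"zip": "94105", "behavior": "auto-intender"},   # outside footprint
    {"zip": "10002", "behavior": "travel-intender"},
]

def eligible(visitor, target_zips=TARGET_ZIPS, behavior=None):
    # Stage 1: geographic filter.
    if visitor["zip"] not in target_zips:
        return False
    # Stage 2: optional behavioral segment within the geography.
    return behavior is None or visitor["behavior"] == behavior

local_audience = [v for v in visitors if eligible(v)]
local_auto = [v for v in visitors if eligible(v, behavior="auto-intender")]
```

The scale caveat Goodman raises falls out directly: each added filter stage shrinks `local_audience`, so a handful of zip codes plus a narrow behavior may leave too few impressions to matter.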
BI: So far one-to-one personalization of creative content has been a very expensive proposition, mostly affordable by only the larger retailers. How and why do you see this expanding in the coming year?
Goodman: We've begun to see one-to-one customization of creative content and product merchandising beyond the large retail space. We have a large insurance firm, for instance, now changing product and content mix based on past habits and current activity of visitors. Utilizing dynamic content controls, you can customize creative by demographic and behavioral profiles, not just in terms of offers but by aesthetic components that convey different aspects of brand association and image.
BI: Targeting for influence rather than purely transactionally is another goal that's been long talked about. Why do you think this year the effectiveness measurement will move decisively beyond clicks?
Goodman: Increasingly we'll see a greater emphasis on knowing more about who views campaigns but doesn't necessarily click. There's a world beyond click behavior that we're just beginning to truly understand enough to get some leverage. Part of that is learning to understand the effects of a campaign on the particular demographic, psychographic and geographic targets of the campaign. Campaigns could have impacts on awareness, image, interest and many things that influence subsequent behavior but aren't encompassed in a click.
Even more interestingly, we're starting to look at the impact of campaigns on audiences you didn't target. Identifying surprising, unpredictable behaviors you would have missed in the past is as important as leveraging known behaviors. For instance, what if your campaign is geared to males 18-34 in cities, but it turns out the campaign is actually seeing a lift among 35- to 49-year-old males in the suburbs? If you know that in a timely enough fashion, you can shift your focus and direct new offers and approaches based on those insights.
The final thing I'd say looking ahead is that we're now at a point where we can finally begin leveraging all the data assets we have in real time, based on up-to-the-minute information. In the past, collecting, storing and interpreting data was a distinct and isolated phase. You crunched and analyzed the data. Then, in a very separate process, you tried to take whatever insights you could glean and develop a creative to translate them into action and execution. Both processes were on different tracks. For the first time, they can now truly begin to converge.
Source: Milken Institute
More than half of all Americans suffer from one or more chronic diseases. Despite dramatic improvements in therapies and treatment, the rates of disease have risen dramatically -- and that rising rate is a crucial but frequently ignored contributor to rising medical expenditures.
The human and economic toll of chronic disease on patients' families and society is enormous. Yet while a number of studies have sought to estimate the economic costs of illness, there has not been a significant focus on estimating the costs that could be avoided through efforts to reduce the prevalence and burden of chronic disease. The purpose of this study is to quantify the economic and business costs of chronic disease: the potential impact on employers, the government and the nation's economy. It estimates current and future treatment costs and lost productivity for seven of the most common chronic diseases -- cancer (broken into several types), diabetes, hypertension, stroke, heart disease, pulmonary conditions and mental disorders. Each has been linked to behavioral and/or environmental risk factors that broad-based prevention programs could address.
Among the findings:
More than 109 million Americans report having at least one of the seven diseases, for a total of 162 million cases.
The total impact of these diseases on the economy is $1.3 trillion annually.
Of this amount, lost productivity totals $1.1 trillion per year, while another $277 billion is spent annually on treatment.
On our current path, we project a 42 percent increase in cases of the seven chronic diseases by 2023, along with $4.2 trillion in treatment costs and lost economic output.
Under a more optimistic scenario, assuming modest improvements in preventing and treating disease, we find that in 2023 we could avoid 40 million cases of chronic disease.
We could reduce the economic impact of disease by 27 percent, or $1.1 trillion annually; we could increase the nation's GDP by $905 billion linked to productivity gains; we could also decrease treatment costs by $218 billion per year.
Lower obesity rates alone could produce productivity gains of $254 billion and avoid $60 billion in treatment expenditures per year.
The full report includes a State Chronic Disease Index, as well as tables for each state, looking at disease rates, projections of avoidable costs, and intergenerational impacts.
Tuesday, December 04, 2007
Android™ will deliver a complete set of software for mobile devices: an operating system, middleware and key mobile applications. An early look at the Android Software Development Kit (SDK) is now available.
Android was built from the ground-up to enable developers to create compelling mobile applications that take full advantage of all a handset has to offer. It is built to be truly open. For example, an application could call upon any of the phone's core functionality such as making calls, sending text messages, or using the camera, allowing developers to create richer and more cohesive experiences for users. Android is built on the open Linux Kernel. Furthermore, it utilizes a custom virtual machine that has been designed to optimize memory and hardware resources in a mobile environment. Android will be open source; it can be liberally extended to incorporate new cutting edge technologies as they emerge. The platform will continue to evolve as the developer community works together to build innovative mobile applications.
All applications are created equal
Android does not differentiate between the phone's core applications and third-party applications. They can all be built to have equal access to a phone's capabilities providing users with a broad spectrum of applications and services. With devices built on the Android Platform, users will be able to fully tailor the phone to their interests. They can swap out the phone's homescreen, the style of the dialer, or any of the applications. They can even instruct their phones to use their favorite photo viewing application to handle the viewing of all photos.
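The swappable-defaults idea above, letting the user designate any application, third-party or core, as the handler for a given action, can be sketched conceptually. To be clear, this is not the Android API (which uses Intents and resolvers); it is just the dispatch concept in plain Python, with invented handler names.

```python
# Conceptual sketch (not the actual Android API): a registry that resolves
# an action to whichever application the user has chosen as the default.
# Action and handler names are hypothetical.

registry = {"view_photo": "stock_gallery", "dial": "stock_dialer"}

def set_default(action, handler):
    # The user swaps in a third-party app for a core action.
    registry[action] = handler

def resolve(action):
    # The platform looks up the current default at dispatch time,
    # so core and third-party apps are treated identically.
    return registry[action]

set_default("view_photo", "favorite_photo_app")
```

Because resolution happens at dispatch time rather than being hard-wired, replacing the homescreen, dialer, or photo viewer requires no change to the applications that invoke them.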
Breaking down application boundaries
Android breaks down the barriers to building new and innovative applications. For example, a developer can combine information from the web with data on an individual's mobile phone -- such as the user's contacts, calendar, or geographic location -- to provide a more relevant user experience. With Android, a developer could build an application that enables users to view the location of their friends and be alerted when they are in the vicinity giving them a chance to connect.
Fast & easy application development
Android provides access to a wide range of useful libraries and tools that can be used to build rich applications. For example, Android enables developers to obtain the location of the device, and allows devices to communicate with one another enabling rich peer-to-peer social applications. In addition, Android includes a full set of tools that have been built from the ground up alongside the platform providing developers with high productivity and deep insight into their applications.
To know how you'll be using computers and the Internet in the coming years, it's instructive to consider the Google employee: most of his software and data--from pictures and videos, to presentations and e-mails--reside on the Web. This makes the digital stuff that's valuable to him equally accessible from his home computer, a public Internet café, or a Web-enabled phone. It also makes damage to a hard drive less important. Recently, Sam Schillace, the engineering director in charge of collaborative Web applications at Google, needed to reformat a defunct hard drive from a computer that he used for at least six hours a day. Reformatting, which completely erases all the data from a hard drive, would cause most people to panic, but it didn't bother Schillace. "There was nothing on it I cared about" that wasn't accessible on the Web, he says.
Schillace's digital life, for the most part, exists on the Internet; he practices what is considered by many technology experts to be cloud computing. Google already lets people port some of their personal data to the Internet and use its Web-based software. Google Calendar organizes events, Picasa stores pictures, YouTube holds videos, Gmail stores e-mails, and Google Docs houses documents, spreadsheets, and presentations. But according to a Wall Street Journal story, the company is expected to do more than offer scattered puffs of cloud computing: it will launch a service next year that will let people store the contents of entire hard drives online. Google doesn't acknowledge the existence of such a service. In an official statement, the company says, "Storage is an important component of making Web apps fit easily into consumers' and business users' lives ... We're always listening to our users and looking for ways to update and improve our Web applications, including storage options, but we don't have anything to announce right now." Even so, many people in the industry believe that Google will pull together its disparate cloud-computing offerings under a larger umbrella service, and people are eager to understand the consequences of such a project.
To be sure, Google isn't the only company invested in online storage and cloud computing. There are other services today that offer a significant amount of space and software in the cloud. Amazon's Simple Storage Service, for instance, offers unlimited and inexpensive online storage ($0.15 per gigabyte per month). AOL provides a service called Xdrive with a capacity of 50 gigabytes for $9.95 per month (the first five gigabytes are free). And Microsoft offers Windows Live SkyDrive, currently in beta, with a one-gigabyte free storage limit.
But Google is better positioned than most to push cloud computing into the mainstream, says Thomas Vander Wal, founder of Infocloud Solutions, a cloud-computing consultancy. First, millions of people already use Google's online services and store data on its servers through its software. Second, Vander Wal says that Google's culture enables its teams to more easily tie together the pieces of cloud computing that today might seem a little scattered. He notes that Yahoo, Microsoft, and Apple are also sitting atop huge stacks of people's personal information and a number of online applications, but there are barriers within each organization that could slow down the process of integrating these pieces. "It could be," says Vander Wal, "that Google pushes the edges again where everybody else has been stuck for a while."
One of the places where Google, in particular, could have a large impact is integrating cloud computing into mobile devices, says Vander Wal. The company recently announced Android, a platform that allows people to build software for a variety of mobile phones. The alliance could spur the creation of mobile applications geared toward cloud computing, he says. People want to seamlessly move their data between computers, the Web, and phones, Vander Wal adds. "If Google is starting to solve that piece of the problem, it could have an impact because that's something no one's been able to do yet."
SAN FRANCISCO (AP) -- Seeking to keep the peace in its popular online hangout, Facebook Inc. has overhauled a new advertising system that sparked privacy complaints by turning its users into marketing tools for other companies.
Under the changes outlined late Thursday, Facebook's 55 million users will be given greater control over whether they want to participate in a three-week-old program that circulates potentially sensitive information about their online purchases and other activities.
Facebook provided two different opportunities to block the details from being shared, but many users said they never saw the "opt-out" notices before they disappeared from the screen.
With the reforms, Facebook promised its users will now have to give their explicit consent, or "opt-in," before any information is passed along.
The concessions were made after more than 50,000 Facebook users signed an online petition blasting the system, called "Beacon," as a galling intrusion that put the Palo Alto-based startup's pursuit of profit ahead of its members' privacy interests.
More than 40 different Web sites, including Fandango.com, Overstock.com and Blockbuster.com, had embedded Beacon in their pages to track transactions made by Facebook users.
Unless instructed otherwise, the participating sites alerted Facebook, which then notified a user's friends within the social network about items that had been bought or products that had been reviewed.
Facebook thought the marketing feeds would help its users keep their friends better informed about their interests while also serving as "trusted referrals" that would help drive more sales to the sites using the Beacon system.
But thousands of Facebook users viewed the Beacon referrals as a betrayal of trust. Critics blasted the advertising tool as an unwelcome nuisance with flimsy privacy protections that had already exasperated and embarrassed some users.
Some users have already complained about inadvertently finding out about gifts bought for them for Christmas and Hanukkah after Beacon shared information from Overstock.com. Other users say they were unnerved when they discovered their friends had found out what movies they were watching through purchases made on Fandango.
If Facebook adheres to the new "opt-in" standard, "it would be a significant step in the right direction," said Adam Green, a spokesman for MoveOn.org, which launched the petition drive to revamp Beacon just nine days ago. "It also says a lot about the ability of Internet users to band together to make a difference."
The backlash against Beacon illustrated the delicate balancing act that Facebook must negotiate as the company tries to cash in on its popularity without alienating the users fueling its success.
Beacon is a key component in Facebook's "Social Ads" program, which is vying to make more money from the rapidly growing audience that uses the social network's free services as a place to flirt, gossip and share personal passions.
Privately held Facebook already is believed to generate more than $150 million (€102 million) in annual revenue after just three years in business, but it's under pressure to accelerate its growth.
Microsoft Corp. raised the stakes last month by paying $240 million (€163 million) for a 1.6 percent stake. The investment valued Facebook at $15 billion (€10 billion) -- an assessment that will require the company to become a lot more profitable in the next few years.
Skeptics have questioned Facebook's market value, given the company's brief existence and the inexperience of its 23-year-old chief executive, Mark Zuckerberg, who started the social network in 2004 while he was still a Harvard University student.
This isn't the first time that Facebook has done an about-face after introducing a feature that raised privacy concerns. Last year, Facebook rolled out a "news feeds" tool that tracked changes to users' profiles. After thousands of users rebelled, Zuckerberg issued a contrite apology and added a way to turn off the news feeds.
This time around, a customer support representative expressed Facebook's regrets in a Wednesday night note that foreshadowed the changes made Thursday.
"We're sorry if we spoiled some of your holiday gift-giving plans," Facebook's Paul Janzer wrote in a posting addressed to Beacon's critics. "We are really trying to provide you with new meaningful ways, like Beacon, to help you connect and share information with your friends." Janzer also acknowledged Beacon "can be kind of confusing."
Zuckerberg, whose stake in Facebook is worth $3 billion (€2 billion), thought Beacon's referral system would be seen as friendly product endorsements that generated more sales than traditional advertising. He hailed the distribution of peer recommendations as advertising's "holy grail" when Beacon was introduced earlier this month.
But Beacon may lose some of its luster with the tougher privacy controls. That's because fewer people typically participate in services with opt-in provisions.
Authenticity over Exaggeration:
The New Rule in Advertising
Published: December 3, 2007
Author: Julia Hanna
Source HBR Working Knowledge
Imagine the glee of marketers at the dawn of the Internet era—could anyone conceive of a more sophisticated, precise way of reaching consumers? By tracking the purchasing habits of their prey, marketers could respond with targeted advertising and special offers, resulting in (of course) increased sales.
The past 10 years have seen some level of this direct marketing model bear out. But according to an HBS working paper to be published in the Journal of Interactive Marketing, consumers are using technology to learn about marketers, rather than the other way around.
As consumers use sites such as eBay, YouTube, and Facebook to gather information and share opinions on how they spend their money, an entirely new marketing philosophy is called for, one in which the marketer no longer controls the message.
In "Digital Interactivity: Unanticipated Consequences for Markets, Marketing, and Consumers," HBS professor John Deighton and Leora Kornfeld, research director of Canada's Mobile MUSE Consortium, pinpoint 5 qualities of success in this new world of digital media marketing.
In this new reality, it's the consumer who runs the show for the most part, not the marketer—in fact, forget the "consumer" label altogether.
It's too limiting.
Deighton cites Dove's "Real Beauty" campaign, a multiphase effort with an underlying theme that subverts traditional beauty product messages of aspiration and perfection. In one ad, full-sized, regular-looking women are used. In another, young girls reveal insecurities about their looks, showing the harm done by unrealistic standards set by the industry. (Dove is also the subject of a new case by Deighton.)
"The story of Dove is one of a brand that progressively cedes control," Deighton says. "In the 1950s, Dove's advertising approach was similar to a World War II military campaign with a heavy bombardment of 30- and 60-second messages with very strong, functional content. It was all delivered with complete control over the message and the media."
Word to the media wise
That sort of approach isn't possible in today's media-rich world—and probably wouldn't be very effective anyway.
"It's more like the Vietnam War now," Deighton continues. "The ideas have to belong to the people you're attempting to engage with, and that's going to be achieved through indirect methods rather than by going directly at the enemy." Instead of overwhelming consumers with a message, get them talking by presenting a topic they want to discuss. Then stand back and cross your fingers.
"When a brand adopts a point of view, rather than simply making a claim for softer skin, for instance, it can become a lightning rod for discourse," Deighton remarks. "You have to be confident that your message can withstand reinterpretation."
The Dove ads, for example, have been parodied on late-night television, although that level of exposure hasn't bothered Unilever, Dove's parent company. "An executive there told me that you can't buy this kind of publicity," says Deighton.
The new rules
But what does this all boil down to for companies that want to be successful in this relatively new environment? In the working paper, Deighton and Kornfeld discuss 5 aspects of digital interactivity, including:
1. Thought tracing. Firms infer states of mind from the content of a Web search and serve up relevant advertising; a market born of search terms develops.
2. Ubiquitous connectivity. As people become increasingly "plugged in" through cell phones and other devices, marketing opportunities become more frequent as well—and technology develops to protect users from unwanted intrusions. A market in access and identity results.
3. Property exchanges. As with Napster, Craigslist, and eBay, people participate in the anonymous exchange of goods and services. Firms compete with these exchanges, and a market in service, reputation, and reliability develops.
4. Social exchanges. People build identities in virtual communities like Korea's Cyworld (90 percent of Koreans in their 20s are members). Firms may then sponsor or co-opt communities. A market in community develops that competes on functionality and status.
5. Cultural exchanges. While advertising has always been part of popular culture, technology has increased the rate of exchange and competition for buzz. In addition to Dove's campaign, Deighton cites BMW's initiative to hire Hollywood directors and actors to create short, Web-only films featuring BMWs. In the summer of 2001, the company recorded 9 million downloads.
These 5 aspects show increasing levels of effective engagement in creating social meaning and identity, Deighton suggests, noting that the first 2 (thought tracing and ubiquitous connectivity) change the rules of marketing but don't alter the traditional paradigm of predator and prey. In the last 3 (property, social, and cultural exchanges), the marketer has to become someone who is invited into the exchange or is even pursued (as in the case of the BMW films) as an entity possessing cultural capital.
So what's the best course of action for marketers faced with this complex new world of meaning-making? Deighton challenges his students in HBS's executive Owner/President Management Program to think of a witty, self-aware ad that they could create for their business for the price of a handheld camera.
Admittedly, this is no easy feat when you run a scrap metal dealership. But it can be done. One popular video on YouTube, "A Big Ad," features 3 young men parodying a grand-scale, cast-of-thousands Carlton Draught beer ad for a small local dairy.
Deighton also cites a former Swiss student now working for a pharmacy lab who finds young, classically trained musicians; records their work; and distributes the CDs to customers.
"He blends the purity of the artist and a sense of discovery with his business," he says. "It speaks to a certain authenticity, which in this world becomes a much more desirable property than exaggeration."
And as digital interactivity increases the contexts in which people use new media, it becomes less and less productive to think of people as consumers alone.
"If a company limits its engagement to the part of the person's life when he or she is thinking about skin care, for example, it diminishes that person and marginalizes the brand," Deighton says. "I think the central idea here is that in the future, brands will be more talked about than talking."
Monday, December 03, 2007
In just the last two years, consumers have been awash in new digital platforms, from social networking to Web applications, mobile connectivity to video-on-demand. But how are users really embracing these formats? Is the technology driving new kinds of online behavior that all marketers need to watch more carefully?
For example, a recent Advertising.com survey of online video usage showed a radical shift in usage patterns just between the last six months of 2006 and the first half of 2007. In the coveted 18- to 34-year-old young adult demo, 51% of connected consumers accessed TV episodes online, compared to 33% six months prior. Likewise the audience for movie trailers rocketed from 19% to 32%, and user-generated video from 28% to 42%. Historically, one of the last times we saw people respond to an emerging platform this quickly was with TV itself, which achieved a 50%+ reach in the U.S. in less than five years in the early 1950s.
Avenue A/Razorfish Digital recently published a "Digital Consumer Behavior Study," which queried consumer use of new platforms with equally striking results. Onliners are responding very quickly to new technologies, and it is having a demonstrable effect on the media they consume. The uptake of RSS (Really Simple Syndication) has been accelerated substantially by Yahoo and Google personalized home pages, which make it easy to drag and drop new content feeds onto a start-up page without the user even knowing they are using RSS. In its sample of 475 respondents, Avenue A found that 56% subscribe to RSS. Only a year or two ago, most industry sources pegged RSS usage at well below 20%. Like online video, the new platform has taken a fast track with users. The survey shows that 52% of users find RSS useful "most" (38%) or "all" (14%) of the time.
The net effect of having so many information outlets available in a single place has been a remarkable fragmentation of media consumption into personalized collections of niche programming. Not only have about 60% of users created customized home pages of feeds and widgets, but 70% read blogs regularly and 46% read four to five blogs. A staggering 67% of Web surfers are watching video on YouTube and other video aggregators. It is hard to believe that such high levels of personalized media aggregation are not cutting substantially into the mindshare consumers devote to major media. In fact, the survey found 91% of connected consumers agreeing that they rely on the Web for current news and information more than television.
The days of mass retailing may be numbered as well. Just as the taste for media content is diffusing quickly, so has the top end of the purchase funnel widened. A clear majority of online product researchers (54%) start with general search, while only 30% go to retailer Web sites or even e-commerce specialists like Amazon. Only 17% say they actively seek out online retailers that have a brick and mortar presence as well. Just as striking is the power of word of mouth online, with 61% reporting that they have made a purchase decision based on a user-generated review or product recommendation. The media ecosystem that supports the product-buying process is also under pressure from media fragmentation. Interactivity doesn't just decentralize media but also diffuses authority. When researching online, 55% say they consult user reviews vs. 21% who rely most on expert reviews. The age of the predictable and well-defined purchase funnel could be over.
Despite accelerated adoption of certain Web 2.0 platforms, consumers are keeping some technologies at arm's length for now. Tag clouds, for instance, attract only 12% of surfers with any regularity, and 65% say they never use them. And for all the hype and attention spent on mobile, 64% never use their phone to check news, sports or stock headlines.
While conventional wisdom suggests that technology transforms culture and behaviors, history demonstrates just the opposite. Despite immediate fascination with motion pictures, it took film and its middle-class audiences almost two decades to find one another in the early part of the last century. Television was technically feasible by 1937 (CBS had an experimental programming studio in the '30s). But the medium did not match the cultural moment until the post-WWII age of mass consumption, suburban lifestyles and nuclear families. Some technologies, like the video phone (World's Fair, 1964), never moved our habits an inch, even after decades of false starts.
Likewise, the adoption of Web 2.0 technologies in the last few years is impressive but still selective. We embrace those aspects of social networking, recommendation engines, and fragmented media distribution that fit the cultural moment and enable other tendencies in the culture. Behavior and technology change one another. In this case, the trajectory clearly aims towards personalization, both of media production and consumption.
Digital behavior is showing what futurists have been telling us for several years. Mass media is history. Turn the page to the next chapter.
By Kevin Allison in San Francisco and Andrew Edgecliffe-Johnson in London
Published: December 3 2007 Source FT.com
Dell plans to consolidate its worldwide marketing, advertising and communications operations into a single new public relations unit led by WPP, the UK advertising group.
The advertising pact, worth $4.5bn in billings over the next three years, will come as a significant fillip for WPP as Sir Martin Sorrell, its chief executive, has been trying to persuade investors that they are undervaluing the company. WPP's stock price has fallen 18 per cent in the past six months, underperforming the London market.
In an analyst's note in September, John Janedis of Wachovia Capital Markets said the Dell account could be worth as much as $90m to $100m of annual revenues to the winner, Bloomberg reported.
Mark Jarvis, Dell’s chief marketing officer, said WPP would set up a separate company to deal exclusively with the computer maker.
Dell, the world’s second-biggest personal computer maker, said the new arrangement would simplify a sprawling PR operation spread across as many as 800 separate companies.
In a post on a company blog, Mr Jarvis said: “By combining our agencies, we can invest in the long-term, in the people and tools to unlock far greater results than a patchwork quilt.
“We believe this is the first time a global client and agency have come together to redefine the ‘agency’ on such a scale.”
The push to consolidate Dell’s marketing, communications and advertising activities comes as the PC maker is trying to raise the profile of its brand amid fierce competition from Hewlett-Packard, its biggest rival.
HP has continued to take PC market share from Dell in recent quarters, in spite of an aggressive turnround effort launched this year by Michael Dell, chairman and chief executive.
Casey Jones, Dell’s vice-president of global marketing, said a top priority of the new WPP group would be to support the computer maker’s push into retail stores.
Dell this year ended more than two decades of near-exclusive reliance on direct sales of computers over the telephone and internet by striking sales deals with several retailers, including Wal-Mart.
Mr Jones said: “After seven months of struggling with a patchwork quilt of 800 different shops, we’ve got a large partner helping us build the resources necessary to the retail channel”.
The Dell deal may bolster WPP’s argument that its full-service approach can counter Google’s growing weight in online advertising and the search engine’s incursions into print and broadcast advertising.
Speaking at a conference in Barcelona last month, Sir Martin said the internet search group had ambitions to develop client relationships similar to those of big advertisers such as WPP.
“That’s where the next battleground will be,” Sir Martin said, adding that Google was a “short-term friend and a long-term enemy”.
Friday, November 30, 2007
The thing that makes computers a huge pain for everybody, says Pedro Domingos, an associate professor of computer science at the University of Washington, is that you have to explain to them every little detail of what they need to do. "It's really annoying," Domingos jokes. "They're stupid."
That's why Domingos is taking part in CALO, a massive, four-year-old artificial-intelligence project to help computers understand the intentions of their human users. Funded by the Defense Advanced Research Projects Agency (DARPA), and coordinated by SRI International, based in Menlo Park, CA, the project brings together researchers from 25 universities and corporations, in many areas of artificial intelligence, including machine learning, natural-language processing, and Semantic Web technologies. Each group works on pieces of CALO, which stands for "cognitive assistant that learns and organizes."
Adam Cheyer, program director of the artificial-intelligence center at SRI, explains that CALO tries to assist users in three ways: by helping them manage information about key people and projects, by understanding and organizing information from meetings, and by learning and automating routine tasks. For example, CALO can learn about the people and projects that are important to a user's work life by paying attention to e-mail patterns. It can then categorize and prioritize information for the user, based on the source of the information and the projects to which it is connected. The system can also apply this type of understanding to meetings, using its speech-recognition system to make a transcription of what's said there, and its understanding of the user's projects and contacts to process the transcription intelligently into to-do lists and appointments. Finally, a user can teach CALO routine tasks such as purchasing books online and searching for bed-and-breakfasts that meet specific criteria. CALO can interact with other people, taking on tasks such as scheduling meetings, coordinating among people's schedules, and making decisions, such as deciding to reschedule a meeting if a key member becomes unable to attend.
"It's an amazingly large thing, and it's insanely ambitious," Domingos says. "But if CALO succeeds, it'll be quite a revolution. Even if it doesn't, so much good research is happening under it that it will still have been worthwhile."
The goal is to build an artificial intelligence that can serve as a personal assistant for someone--not something with a rigid structure within which it can be helpful, like the animated paper-clip assistant featured in Microsoft Office products, but a system that can learn about a user's environment and needs, and adapt to them, without having to be programmed anew by engineers. "What's different and has never been done before in this way is the truly integrated approach of bringing all of these technologies and all of these capabilities into a single system," says Cheyer. "It takes a system of this size to give you something that can understand and organize so much information."
The project might seem broad in its goals, but the researchers believe that ultimately, the system will benefit from multiple technologies working together. Consider the meeting-transcription function, says William Mark, vice president of the information and computer-science division at SRI. Even the best speech-recognition systems would have trouble producing an accurate transcript of a meeting unassisted, he says, but "in our context, because of information management, CALO has deep and rich knowledge about who are the people in the room, and what are the documents and phrases and slang used in context."
Since CALO has many learning systems, one challenge is integrating them so that CALO has a consistent structure for information that it can use to make decisions based on the noisy, uncertain data that it extracts from its various interactions. Domingos and others have been working on a probability consistency engine, which unifies two traditional approaches to artificial intelligence: logic and probability.
Alan Qi, an assistant professor of computer science at Purdue University, who is not involved with CALO, says that the unification of logic and probability is an important endeavor for the field of artificial intelligence. Combining these two approaches, Qi says, is far better than using either alone. Probabilistic approaches can handle noise and uncertainty well, while a logical structure is best for handling meaning.
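The details of CALO's probability consistency engine are not public, but the flavor of unifying logic and probability can be sketched in the Markov-logic style associated with Domingos's research group: weighted logical rules define a probability distribution over possible worlds, so a rule can raise the likelihood of a conclusion without rigidly forcing it. The atoms, rules, and weights below are purely illustrative, not taken from CALO:

```python
import itertools
import math

# Toy fusion of logic and probability: each world's probability is
# proportional to exp(sum of weights of the rules it satisfies).
ATOMS = ["urgent_email", "meeting_scheduled"]

# Each rule: (weight, predicate over a world). Higher weight = firmer belief.
RULES = [
    # soft implication: urgent_email => meeting_scheduled
    (1.5, lambda w: (not w["urgent_email"]) or w["meeting_scheduled"]),
    # mild prior that most email is not urgent
    (0.5, lambda w: not w["urgent_email"]),
]

def worlds():
    """Enumerate every truth assignment over the atoms."""
    for vals in itertools.product([False, True], repeat=len(ATOMS)):
        yield dict(zip(ATOMS, vals))

def score(w):
    """Unnormalized weight of a world: exp of satisfied-rule weights."""
    return math.exp(sum(wt for wt, rule in RULES if rule(w)))

def prob(query):
    """P(query) under the weighted-rule distribution."""
    z = sum(score(w) for w in worlds())
    return sum(score(w) for w in worlds() if query(w)) / z

# Soft inference: the implication raises, but does not force,
# the chance a meeting gets scheduled given an urgent email.
p = prob(lambda w: w["meeting_scheduled"] and w["urgent_email"]) / \
    prob(lambda w: w["urgent_email"])
print(f"P(meeting_scheduled | urgent_email) = {p:.3f}")  # ≈ 0.82, not 1.0
```

The point Qi makes shows up directly: pure logic would make the implication absolute, while the probabilistic weighting lets the system tolerate exceptions and noisy evidence.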
Although CALO's approach is far-reaching, SRI has made a version, called CALO Express, that distills the features of CALO closest to being deployment-ready. CALO Express is a lightweight version of the real deal that integrates with Microsoft products such as Outlook and PowerPoint. Cheyer says that it includes parts of the three main features of information management, meeting assistance, and task management, and that CALO Express is now being evaluated for use at DARPA. While it's uncertain whether CALO Express will become a commercial product available outside of the military, there is still hope that the average person may get access to technologies of this type. The research has already produced a few products, such as Smart Desktop, an information-management system that spun out of the TaskTracer project done by Oregon State University as part of CALO. Radar Networks, maker of the Semantic Web product Twine, has also worked on some of CALO's semantic underpinnings. (See "The Semantic Web Goes Mainstream.")
After nearly a decade of working on microelectromechanical systems (MEMS) for medical implants, a startup based in Bedford, MA, called MicroChips has prototypes for its first commercial products. By the beginning of January, the company plans to start animal trials of a device for healing bones damaged by osteoporosis. In a year and a half, it hopes to begin human trials on an implant for monitoring glucose levels in diabetics.
The first product, a device for delivering an anti-osteoporosis drug automatically, could allow patients to replace 500 daily injections with a single outpatient implant procedure. The glucose sensor, by continuously monitoring glucose levels, could reveal spikes in blood-sugar levels that go undetected using conventional sensors. Such spikes, if not treated, can contribute to organ damage, including blindness.
While a graduate student at MIT in the mid 1990s, John Santini, now MicroChips' CEO, began to build devices that could release chemicals from any of dozens of reservoirs carved out of silicon and glass using conventional lithography techniques. (See "TR 35, John Santini" and "Implantable Medication.") But transforming this technology into a commercially viable product has proved to be a long process.
Santini and his team at MicroChips faced a number of hurdles. For example, to protect the drugs and sensors until they were released or exposed, it was necessary to hermetically seal all of the electronics, including radios for controlling the device, along with the MEMS reservoirs and caps. The trick was that conventional welding required temperatures that would damage the enzymes and peptides that were to be stored in the wells. So the company developed a cold-compression technique that forms an unbreakable bond.
The remaining challenges had to do with finding particular applications that needed the MEMS array technologies. The company discovered that its technology could provide an answer to a "vexing problem" that had stymied researchers at Medtronic, a multibillion-dollar implant manufacturer, for years, says Stephen Oesterle, Medtronic's senior vice president for technology and a MicroChips board member. A sensor that monitored glucose continuously hadn't been possible, he says, because sensors degrade over time, lasting for at most a month. With the MEMS technology, however, it became possible to implant an array of sensors, activating just one at a time. The sensor is read by onboard electronics, with the data transmitted via radio to an external monitor.
Source: MIT Technology Review
An MIT spinoff is finally ready to begin testing smart implants for drug delivery and sensing.
In the case of osteoporosis treatment, the technology offered a way to deliver a medicine that normally requires daily injections. Parathyroid hormone is the only approved osteoporosis drug that can help the body repair damage caused by osteoporosis, rather than just stop the damage, Santini says. The problem is that the drug cannot be delivered gradually in a conventional controlled-release delivery system: if it remains in the body continuously, it promotes the degradation of bone, making things worse rather than better. As a result, the drug had to be administered via daily injections. MicroChips technology offered a way to deliver quick bursts of the drug automatically. The chip can be programmed to release the drug at regular intervals, and it can also be controlled via radio signals.
These two applications are only the beginning of potential uses for the technology. It could be used with a number of different sensors, providing early warning of heart attacks and strokes. It could also be used to deliver multiple drugs in response to signals from integrated sensors. For diabetes, the glucose sensor could be paired with insulin pumps. Such a system, Oesterle says, would more closely mimic the body's mechanisms for regulating blood sugar.
One in four promotional Web sites operated by financial services firms fails to present information in a "fair, clear and not misleading way", says the UK's Financial Services Authority (FSA) following a review of advertising on investment sites.
The year-long review involved 130 visits to 77 firms' Web sites. Each visit to a Web site replicated the type of journey a customer may be making through the service, says the FSA.
The study found that three quarters of sites met the standards required, but one in four was difficult for consumers to navigate and failed to highlight key information.
This is partly because firms are not placing enough emphasis on the customer journey and general Web site design when placing key information, says the watchdog. In some instances general Web site maintenance was also lacking, resulting in out-of-date or incorrect information being provided to consumers.
The FSA did not name the firms operating the sites that fall short, but says it will carry out another review in March 2008 and will take action if it finds further failings.
Dan Waters, Director of Retail Policy and Themes at the FSA, says for many customers the Internet is the channel of choice for shopping around for financial products, but it can expose consumers to high risk as they are able to make instant purchases without advice.
"We expect the senior management of all regulated firms to ensure their customers are treated fairly - and we will be looking at promotional Web sites again early next year to make sure that firms have taken our findings on board and are taking Web site design seriously," adds Waters.
But the watchdog says standards have improved since it stepped up its supervision of Internet-based promotions following reviews in 2005 and 2006, which identified widespread failings.
Source: Online Spin November 15th, 2007 by Tim Vanderhook
The online advertising industry has recently seen an unprecedented flurry of acquisitions, with Google's acquisition of DoubleClick for $3.1 billion on April 14 setting off a domino effect among its rivals. Yahoo followed on April 30 with its acquisition of the remaining Right Media shares; on May 17 WPP bought 24/7 Real Media. And just a day later — the biggest of all — Microsoft acquired aQuantive for $6 billion. These four big deals were followed in July by AOL's acquisition of TACODA and Microsoft's acquisition of AdECN. The rapid consolidation highlighted the market's close-to-limitless inventory and steadily growing numbers of eyeballs, and further propelled online advertising toward becoming a mature industry. At this moment, however, with the consolidation still far from finished and the industry in flux, many — both inside and outside the industry — are wondering what will happen next.
Display Advertising Battle
The recent acquisitions clearly show that display ads will be the next phase of growth in the online advertising industry, with more and more budget going into them. Google has probably spent the past year or so studying the impact display advertising has on search advertising. The two are tied closely together, with display ads fueling search ads for advertisers. While previously all the credit for producing a great ROI went to the search engines, advertisers now recognize that banner ads are the driving force behind users' searches, and Google is simply the final conduit to a specific site.
Because Google has such a dominant position in search advertising, it is moving quickly to make the weakest part of its business — display advertising — a strength. Google benefited a great deal from the DoubleClick acquisition: it gained access to greater user data, specifically the types of content a user reads outside of Google.com, as well as which ads a user finds of interest. Acquiring what many considered the marquee brand in ad serving bought it credibility in the display advertising market. The Google-DoubleClick deal is not so much about purchasing a new method of targeting as it is about acquiring assets to help its display advertising business. Google stands to lose a lot if search is looked at less favorably in the future, and it has to move now to protect its position.
AOL and Microsoft, instead of trying to catch up to Google's dominance in text ads, are doing the right thing in staying ahead of the curve and strengthening their lead in display ads. The TACODA acquisition reinforces AOL's leadership position rather than attempting to compete in Google's staple category of text ad inventory. Historically, AOL via Advertising.com has been the leader in monetizing display ads on publisher Web sites, and the company knows that the DoubleClick acquisition is not plug-and-play for Google the way Advertising.com originally was for AOL. While Google made a good attempt to get into the market by purchasing an ad-serving technology, it still needs to forge relationships with major publishers on the display side of the business to get inventory to resell. This strategy will prove to be an uphill battle — and for $3.1 billion in cash, that hill just got a lot bigger.
The Bottom Line
While Google certainly seems to be the 800-pound gorilla of online advertising these days, Yahoo, AOL and Microsoft are not yet out of the game. In spite of the rapid consolidation of the market, neither are smaller, more nimble players. As AOL’s recent announcement of integrated Platform A (for media and technologies across all of AOL’s current ad networks) shows, advertisers will increasingly realize that they cannot favor one type of ad or targeting over another, particularly as consumers are becoming increasingly sophisticated in their online behavior. Understanding the online consumer’s mindset better will allow advertisers to choose the most appropriate type of targeting for their campaigns, often opting for a mixture of methods and ad networks.
The biggest winners in this year of acquisitions will be consumers — receiving more relevant ads, at more appropriate times. Experience shows consumers don’t necessarily like advertising but are willing to endure it for free content. If the ads they see are relevant to who they are, what they are interested in, and where they are located, it becomes a win-win for all parties in the chain.
Thursday, November 29, 2007
Omnicom Group Inc.’s Diversified Agency Services division has bought Expert Communications Inc., a San Francisco direct marketing agency that specializes in serving clients in financial services, healthcare, telecommunications and utilities.
Expert Communications will operate as part of Omnicom’s Star Marketing Group, which includes customer relationship marketing-focused agencies including Javelin Direct, Innovyx, Optima and Critical Mass. Terms of the deal were not disclosed.
“Their deep expertise in CRM specifically and their ability to convey complex and highly regulated communications in financial services and technology as well as healthcare” attracted Omnicom, said Ed McNally, president/CEO of Star Marketing Group.
Expert Communications will keep its name and retain its staff of 60. Bara Oscodar and Colleen Cramsie founded the agency in 1995. The shop offers creative, database marketing, production, modeling, analysis, data warehousing and account services.
Sarah Novotny, senior editor in charge of Adotas, summarizes key future trends in online advertising as predicted by Dilip DaSilva, Exponential's founder and CEO. Excerpting from the summary, Novotny says there are six key trends identified in this release.
1. In a world where nearly everything can be anonymously known about a registered user on a large site, marketers are making full use of the data to deliver on the promise of one-to-one marketing. Online advertisers will create highly customized and immersive marketing experiences, fully leveraging the most up-to-date data from their databases.
2. While user-generated content (UGC) is inappropriate for most major brands, we'll see more professionally produced viral campaigns that capitalize on this genre, says DaSilva. With no standards and lots of different players and technologies, video buys are still incredibly labor intensive, and often require a site-by-site exercise, he says. Brand advertisers are clamoring for an affordable, brand-safe environment with the ability to target audiences with video.
Premium content is very expensive, and serious questions are emerging about the level of intrusiveness of running pre-roll. Professionally produced content affords a better value but is more difficult to find at scale. Overlay advertising will likely become the preferred solution, DaSilva anticipates, when it comes to monetizing the large volume of user generated video content, whereas pre-roll will continue to be tolerated in front of high-quality content.
3. 2008 is the year when we finally see a viable, truly local solution for local advertisers. The breakout leader in the online local space will be the company that provides the best user experience and repeat users, "without consumers being bombarded with national ads when they are looking for a local sushi restaurant," says the release.
We believe, reports DaSilva, that "vertical local" will play a vital role, becoming even more granular, or "hyper local," with legal, travel, and home services playing an important role as well. The release notes that Yellowpages.com and Superpages tend to attract the users who are searching for more services.
"Local CPC (cost-per-click) display" will evolve to be a major factor in 2008. This means local users on a national site will be shown display ads of local businesses in their area that are relevant to their interests, using data from the display network. These ads create a greater CTR and also the opportunity for local businesses to advertise on sites that they could not have accessed in the past.
4. As more advertisers launch online branding campaigns, they increasingly want metric feedback, beyond clicks, on the effectiveness of their online marketing efforts, both to justify spending on the Web and to help guide their future media allocations.
With the shift of TV to online, the industry has no choice but to embrace the emotional selling proposition, where emotional value becomes critically important in the buying process. This change will enable online buying to shift from a transactional strategy to an ongoing source of influence on the customer at different stages of the purchase funnel.
5. While the technology of computers acting as intelligent agents has largely lived in the research arena, it is now making its way into mainstream applications. Some online marketers are now deploying applications that "learn," enabling users to structure and share the richness of their online experiences.
6. 2008 will see the increased popularity of Virtual Worlds. Users will increasingly shift towards specialty worlds more closely associated with their lifestyles or interests. This will be an opportunity for marketers to create whole worlds around products, or to customize environments inside specialty virtual environments.
Advertisers can create branded environments and even brand accessories. For example, Coke Studios is an online community with millions of users creating customized music mixes that can be shared and rated by others.
DaSilva concludes that "These are a few of the key topics we think will be consistently discussed during the conferences and industry events in 2008..."
Time magazine last week had a special issue devoted to the second part of this philosophy, called "America By The Numbers." The issue was dedicated to examining how our consumer landscape is changing and what behaviors the modern American exhibits. It was fascinating, and I wanted to share with you some of the information it included and what I feel its impact will be on digital marketers.
· There are 303 million people in the United States. Only 10% of U.S. households have more than 5 people, versus 21% in 1970. This means that more people are venturing away from home and the "traditional family unit" is changing now that the Baby Boomers' kids have "left the roost." For media, that means television and other media habits change to reflect individual interests rather than those of the family unit as a whole.
· Approximately 40% of Americans are "on the move" during the two-hour "rush hour": people driving to work, taking buses, taxis, motorcycles and carpools, and children riding buses to school. 107 million people drive to work alone. That means there is a lot of time for accessing time-shifted and location-shifted media via iPods, iPhones, PSPs and laptops.
· The median household income in 2005 was $48,201, and it appears to be growing slightly. The problem is most people in America don't feel it. The rich are getting richer faster than the poor are getting poorer, which skews the average upward. As a result, luxury goods and higher-ticket items can be affected, as fewer people hold a larger portion of the wealth, leaving a smaller target audience to speak to.
· People in advertising sales appear to be among the happiest one-third of employed Americans, while the unhappiest by job are gas station attendants (who make only approximately $17,750 per year).
· In the U.S., 1% of the total population makes more than $350,501 per year, while 0.01% earns more than $9.6 million per year. What is disturbing about these facts is that the 0.01% with the most income accounts for approximately 5.1% of all U.S. income.
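The concentration the article calls disturbing follows directly from comparing the income share to the population share. A quick calculation, using only the figures quoted above, makes it concrete:

```python
# How much more than the average earner does the top 0.01% make?
# Figures from the Time statistics quoted above.
top_pop_share = 0.0001      # 0.01% of the population
top_income_share = 0.051    # earning 5.1% of all U.S. income

# Dividing income share by population share gives the group's
# income as a multiple of the overall average income.
multiple = top_income_share / top_pop_share
print(f"The top 0.01% earns roughly {multiple:.0f}x the average income")  # ≈ 510x
```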
· The average American sees five to eight movies per year. The average U.S. household has more TVs than people (2.73 TVs versus 2.6 people per household). This means that, try as we might, TV is still king, and "sight, sound and motion" still sells!
· 71% of U.S. households have an Internet connection, translating to about 31 million homes without access to the Internet. 42% of Americans say they go online and work from home. That means the lines between work and home are continuing to blur, as recently examined when results were published about consumers' work habits and the penetration of BlackBerrys and similar devices.
· The average American goes online 2.1 times at work and 1.2 times at home each day. He/she visits an average of 6 domains per day and spends an average of 3 hours and 43 minutes online between work and home. That means that although the Internet is large, our personal realms are small.
· Americans spend an average of $390 billion a year in restaurants versus $364 billion in grocery stores. That means... that we like to eat out a lot.
Tuesday, November 27, 2007
Preliminary estimates from the Newspaper Association of America show that advertising expenditures for newspaper Web sites increased by 21.1 percent to $773 million in the third quarter versus the same period a year ago. This is the fourteenth consecutive quarter of double-digit growth for online newspaper advertising since 2004. The continued year-over-year gains demonstrate the importance of newspaper Web site advertising, which now accounts for 7.1 percent of total newspaper ad spending, compared to 5.4 percent in last year's third quarter.
Total advertising expenditures at newspaper companies were $10.9 billion for the third quarter of 2007, a 7.4 percent decrease from the same period a year earlier. Spending for print ads in newspapers totaled $10.1 billion, down 9 percent versus the same period a year earlier.
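The NAA percentages hang together, which a quick back-of-the-envelope check confirms (all inputs are figures quoted in the text; small residuals are rounding):

```python
# Sanity-check the NAA Q3 2007 figures quoted above.
online_q3_2007 = 773e6    # online newspaper ad spend, Q3 2007
online_growth = 0.211     # +21.1% year over year
total_q3_2007 = 10.9e9    # total newspaper ad spend, Q3 2007
total_decline = 0.074     # -7.4% year over year

# Online share of total spend in Q3 2007 (text says 7.1%)
share_2007 = online_q3_2007 / total_q3_2007

# Back out the prior-year figures and share (text says 5.4%)
online_q3_2006 = online_q3_2007 / (1 + online_growth)
total_q3_2006 = total_q3_2007 / (1 - total_decline)
share_2006 = online_q3_2006 / total_q3_2006

print(f"2007 online share: {share_2007:.1%}")  # 7.1%
print(f"2006 online share: {share_2006:.1%}")  # 5.4%
```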
In the third quarter:
Classified advertising fell 17 percent to $3.4 billion
Retail declined 4.9 percent to $5.1 billion
National was down 2.5 percent, coming in at $1.7 billion
Within the classified print category in the third quarter:
Real estate advertising fell 24.4 percent to $1 billion
Recruitment dropped 19.7 percent to $882.4 million
Automotive was down 17.7 percent to $796.6 million
All other classifieds were up 2.7 percent to $713.3 million
NAA president and CEO John F. Sturm concludes:
"Newspaper Web sites continue to generate substantial revenue by offering advertisers access to the nation's most desirable group of consumers... (while) broader economic issues are (negatively) impacting... real estate, recruitment and retail advertising..."