A tough question, "What is SEO?". As with pretty much all internet-related terms, concepts and notions, "SEO" does not have a unique definition, and it remains a blurry concept in most people's minds.
Still, what is SEO? Since there is no ultimate, fully comprehensive definition of "SEO", the only way forward is to look at several definitions and try to merge them in order to gain the right perspective.
Wikipedia : "Search engine optimization (SEO) is a set of methodologies aimed at improving the visibility of a website in search engine listings. The term also refers to an industry of consultants that carry out optimization projects on behalf of client sites."
Fakezilla : "The changes that are made to the content and code of a web site in order to increase its rankings in the results pages of search engines and directories. These changes may involve rewriting body copy, altering Title or Meta tags, removal of Frames or Flash content, and the seeking of incoming links."
The Web Search Workshop : "The term used to describe the marketing technique of preparing a website to enhance its chances of being ranked in the top results of a search engine once a relevant search is undertaken. A number of factors are important when optimizing a website, including the content and structure of the website's copy and page layout, the HTML meta-tags and the submission process."
6am Media : "The process of improving web pages so that it ranks higher in search engine for targeted keywords with the ultimate goal of generating more revenue from the web site. There are many SEO techniques. In general, these techniques can be categorized as On-Page Optimization, On-Site Optimization, and Off-Site Optimization. There are also two schools of SEO: white hat SEO and black hat SEO. White hat SEOs are those that play by the rule (actually guidelines provided by search engines). Black hat SEOs are those that push the limit of SEOs and employ some questionable or prohibited techniques (according to the guidelines). These black hat SEO techniques are also commonly known as spam."
Website NOVA : "acronym for search engine optimization. This is the process of making a website 'search-engine-friendly'. Search engine optimization is primarily used to increase rankings in SERPs, and effective SEO can increase the potential of your website and bring in more traffic."
Thousands more definitions are available, almost as many as the "SEO gurus" you will find online ("The Guru Problem" is actually the title of another article to be published soon).
As you can see, no definition is like another, but they all tend to converge to a certain common understanding.
There are numerous techniques and tools used to achieve SEO goals, and they should NOT be included within a definition.
Since it is not correct to define a concept through its tools, here is the definition I have arrived at after long deliberation:
SEO = abbreviation for "Search Engine Optimization", the process of optimizing and tuning a web site and gaining online awareness for it, in order to deliver targeted visitors and ensure high conversion rates.
When done correctly, SEO activities must:
- make search engines crawl the site;
- make search engines index the site;
- ensure high rankings in SERPs (Search Engine Results Pages) for given keywords;
- achieve a high page rank;
- drive targeted traffic;
- achieve high conversion rates among the site's visitors.
Since nothing is definitive and ultimate in the world of SEO, I'd like to receive your feedback and comments: you can contact me and speak up about your marketing concerns at TeaWithEdge.com.
Learn how to optimize a dynamic website. What problems do search engines face in indexing dynamic URLs? And what search engine optimization techniques can we use for dynamic URLs?
Dynamic Websites Search Engine Optimization
Dynamic websites are websites whose pages are generated on the fly. Unlike static pages (primarily .htm/.html pages), dynamic pages are generated when a user triggers an action through that particular page.
Consider a sample dynamic URL from www.bbc.co.uk. The dynamic part of the URL, i.e. the portion that changes with each visitor's request, is everything after the question mark (?).
What are the problems that search engines face in indexing Dynamic URLs?
1. Search engines often consider a dynamic URL as an infinite set of links.
2. Since dynamic URLs are most commonly used in online shopping carts, a session ID is often appended to a particular page's URL. As the session IDs for that page change, the search engine spider would need to index an infinite number of copies of the same page, which is a Herculean task.
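To make the duplicate-URL problem concrete, here is a small Python sketch of the canonicalization a crawler (or a site owner generating links) might apply: it strips common session parameters so that all session-specific variants collapse to one URL. The parameter names and the example store URL are hypothetical, not taken from any real crawler:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameter names that commonly carry session state; a hypothetical list,
# adjust it for your own shopping-cart software.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def canonicalize(url):
    """Strip session parameters so duplicate URLs collapse to one page."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("http://www.my-online-store.com/books.asp?id=1190&sid=abc123"))
# http://www.my-online-store.com/books.asp?id=1190
```

Every distinct session ID would otherwise look like a brand-new page to the spider; after canonicalization they all map to the same address.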
3. Proceeding with the same logic presented in point #2, indexing the same dynamic page over and over might overload the search engines' servers and therefore prevent them from presenting the most relevant information in the fastest possible time.
Here is what Google says about indexing of dynamic websites -
Reasons your site may not be included: Your pages are dynamically generated. We are able to index dynamically generated pages. However, because our web crawler can easily overwhelm and crash sites serving dynamic content, we limit the amount of dynamic pages we index. (Source - http://www.google.com/webmasters/)
What are the options that you have in order to make a search engine spider index your Dynamic URLs?
1. Use of software - Exception Digital Enterprise Solutions (http://www.xde.net) offers software that can change dynamic URLs to static ones. Named XQASP, it will remove the "?" in the query string and replace it with "/", thereby allowing the search engine spiders to index the dynamic content.
For example, http://www.my-online-store.com/books.asp?id=1190 would be rewritten into a static-form URL without the query string. The latter, being a static URL, can easily be indexed by the search engine spiders.
2. Use of CGI/Perl scripts - One of the easiest ways to get your dynamic sites indexed by search engines is to use CGI/Perl scripts. PATH_INFO (or SCRIPT_NAME) is a variable in a dynamic application that contains the complete URL address, including the query string information. To fix this problem, you'll need to write a script that pulls all the information before the query string and sets the rest of the information equal to a variable. You can then use this variable in your URL address.
Example - http://www.my-online-store.com/books.asp?id=1190
When you are using CGI/Perl scripts, the query part of the dynamic URL is assigned to a variable. So, in the above example, "?id=1190" is assigned a variable, say "A". The dynamic URL http://www.my-online-store.com/books.asp?id=1190
will then change to http://www.my-online-store.com/books/A through the CGI/Perl script, and this form can easily be indexed by the search engines.
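The variable-substitution idea above can be sketched in a few lines. The following Python function is not the article's actual Perl script, and the /key/value path layout it produces is an assumption; it simply shows how a query-string URL can map to a path-style one containing no "?":

```python
from urllib.parse import urlparse, parse_qsl

def to_static_form(url):
    """Rewrite a query-string URL into a path-style URL, e.g.
    /books.asp?id=1190 -> /books/id/1190, so spiders see no '?'."""
    parts = urlparse(url)
    path = parts.path.rsplit(".", 1)[0]  # drop the .asp extension
    for key, value in parse_qsl(parts.query):
        path += f"/{key}/{value}"
    return f"{parts.scheme}://{parts.netloc}{path}"

print(to_static_form("http://www.my-online-store.com/books.asp?id=1190"))
# http://www.my-online-store.com/books/id/1190
```

The web server still needs a companion rule or script to translate the path-style request back into the original query before serving the page.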
3. Re-configuring your web servers -
(i) Apache Server - Apache has a rewrite module (mod_rewrite) that enables you to turn URLs containing query strings into URLs that search engines can index. This module, however, isn't installed with Apache by default, so you need to check with your web hosting company about installing it.
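The article stops short of showing an actual rule, so here is a minimal, hypothetical mod_rewrite sketch. The path pattern, script name and parameter name (books.asp, id) are assumptions borrowed from the earlier my-online-store.com example, not from any real site configuration:

```apache
# Hypothetical .htaccess fragment: assumes a page books.asp taking an
# "id" query parameter, as in the earlier my-online-store.com example.
RewriteEngine On

# Let spiders request the static-looking /books/1190 and have Apache
# internally serve the real dynamic page /books.asp?id=1190.
RewriteRule ^books/([0-9]+)$ /books.asp?id=$1 [L]
```

With a rule like this in place, you publish and link only the static-looking form; the query-string version never needs to appear in your pages.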
(ii) ColdFusion - You'll need to reconfigure ColdFusion on your server so that the "?" in a query string is replaced with a "/" and the parameter values are passed along in the URL path.
4. Creation of a Static Page linked to an array of dynamic Pages -
This approach is very effective, especially if you own a small online store selling a few products. Just create a static page linking to all your dynamic pages, and optimize this static page for search engine rankings. Include link titles for all the product categories, and add appropriate "alt" tags to the product images along with product descriptions containing highly popular keywords relevant to your business (you can conduct keyword research for your site through http://www.wordtracker.com). Submit this static page, along with all the dynamic pages, to the various search engines, conforming to their submission guidelines.
How did Amazon.com, Earth's Biggest Bookstore, cope with the issue of indexing dynamic URLs?
A search in Google for internet marketing books yielded a result that takes you directly to the appropriate dynamic page at Amazon: http://www.amazon.com/exec/obidos/ISBN%3D0395683297/103-0475212-8205437.
Since the above URL does not contain any query strings, all search engines can index Amazon.com's products. Amazon.com uses this method to get its product selection indexed by search engines. This is very important for Amazon: as an online bookstore it naturally relies on dynamic URLs, yet it was equally important for the company to make those dynamic URLs friendly to search engine indexing.
Even a few years back, most of the major search engines did not index dynamic URLs, often preventing top search engine rankings for online stores. With Google having started to index dynamic URLs a few months ago, the picture is going to change in the coming days, all the more so because Google's number one position is currently threatened by Microsoft's MSN (which is developing its own search engine) and by Yahoo!, which recently acquired Overture, the biggest player in the PPC search engine industry.
Article Source: http://www.articledashboard.com
As posted on Andy Beal's blog, a number of new fake SEO forums have appeared that seem designed to discredit reputable SEO firms while promoting lesser-known individuals.
Law firm Girard Gibbs & De Bartolomeo, LLP of San Francisco is investigating a potential class action lawsuit against search engine optimization firm Traffic-Power.com, operating out of Las Vegas, Nevada. The suit is apparently based on numerous consumer complaints about improper business practices.
The SEO Consultants Directory has been commissioned by a consortium of anonymous business owners, past and present Traffic-Power clients, and a few industry representatives to research, assemble and publish a collection of online documents that refer to Traffic-Power.com.
The documents are extremely detailed and include alleged reports of Traffic-Power's estimated revenues, executive staff and photographs of a toga party held to celebrate the company's achievement of $100K in revenue earnings in a single week and over $300K in a four week period.
Current and former clients of Traffic-Power are being encouraged to read the documentation and get in touch with the law firm directly for further information. Additionally, it is recommended they keep posted on the Traffic Power issue by visiting http://www.trafficpowersucks.com/.
In the last few weeks, Traffic Power has apparently changed its name to 1P.com. What's really curious is that the two new forums appear to be bad-mouthing many reputable SEO firms while promoting themselves. One such forum is the SEO Professional Forum at http://seo-professional-forum.com/.
As a whole, forums are great and are meant to help people better understand the world of search engines and how to optimize a website. However, the two above-mentioned forums seem to be sorely lacking in objectivity.
Here are a few examples of what has been recently posted:
As various search engines offer analysis into the ROI and performance of marketers' keyword ad campaigns, some search marketers worry Google and Yahoo can obtain too much sensitive information about advertisers' businesses.
Last month, a customer questionnaire from Yahoo's Overture Services paid search unit ignited concern regarding its suggestion of a subscription program that would give insight into competitors' marketing tactics, including price per click, ad budgets and conversion information.
"We do not currently have any plans to use our conversion data for other purposes," said Gaude Paez, an Overture spokeswoman. "We're not planning on using the conversion data we get from advertisers and putting it into reports."
While many privacy advocates have raised issues related to search engines' access to consumer information, some marketers warn that search engines have access to a vast trove of sensitive business data. The idea of sharing sales information with marketing companies that also serve competitors leaves some marketers uneasy.
"The fact of the matter is Google and Overture already have more information than the average person wants them to have," said Lisa Wehr, president of Suttons Bay, MI, search engine marketing firm Oneupweb, which runs paid listings campaigns on Google and Overture. "I don't think most marketers understand the impact of how they can predict their businesses."
Many sophisticated marketers turn to search marketing firms and third-party tools to manage and measure their keyword campaigns. Google and Overture offer free tools for measuring how paid search ads in their network convert to sales.
These tools, geared to the tens of thousands of small and midsize paid search advertisers, work through the insertion of a tracking pixel on their confirmation page to record when a lead converts. The marketer is provided valuable sales data on its keyword campaign, but the data also go to the search provider.
Neither Overture nor Google would say how many marketers use their conversion tools.
Joshua Stylman, managing partner of New York search marketing firm Reprise Media, said conversion tools are very helpful for small marketers, who too often track their search campaigns only by clicks. However, search engines could easily use sales information to raise prices on high-performing keywords, he warned.
"We're relying essentially on finding inefficiencies in the market and exploiting them," he said. "Once those inefficiencies become visible, they're not as valuable."
Terms and conditions for advertisers posted on Google's and Overture's Web sites give the search providers the right to use advertiser information, including conversion data, on an aggregate basis.
Stylman said the differentiation between individual and aggregate data is not that important. For example, a list of keywords with a high conversion rate but low bid price for the mortgage industry would be just as valuable as those for a particular company.
"We see an inherent conflict in the person that owns the inventory having full visibility into the performance data," he said.
Paez said Overture does not use conversion data in pitching clients through its own sales force or for its keyword suggestion tool. She said Overture uses conversion data to develop new advertising products, such as its new keyword matching options.
A Google spokesman said the company uses aggregated conversion data for "general quality and business-related analyses." Its Smart Pricing effort to normalize pricing on keywords that convert differently adjusts click prices on Google's content listings based on several factors, including how well they convert.
Other search marketers say they do not think the data funneled through Overture and Google are a concern, because any misuse would alienate their advertiser base.
"Ultimately, their open auction model determines their fortunes," said Fredrick Marckini, chief executive of iProspect, a Watertown, MA, search marketing firm.
The need to gauge return on investment for search marketing is likely to grow, according to Jupiter Research. It is also likely to lead to more demand for conversion tools. Jupiter Research expects rising click prices will force more marketers to focus on wringing efficiencies from their search campaigns. A Jupiter Research survey found less than half of marketers use sales data to measure the success of their search campaigns.
"The whole point of increasing your efficiency, as a marketer, is to earn more money and get better margins," said Nate Elliott, a Jupiter Research analyst. "If you let the media sellers know how efficient you are, they can just raise prices on you and take away all the benefits from that work you've done."
Source: DM News
Ken Abbott knows the ins and outs of search engine marketing: Dollars for clicks are in, directory listings are out.
Abbott, head of Web marketing for Integramed.com, which sells infertility treatments, considers obtaining an editorial listing in Yahoo's directory "a waste of time," given that 95 percent of his site's traffic comes from pay-per-click advertising in search networks Overture and Google.
"I paid for reviews with all the directories at the beginning of my marketing initiative (in 1999), but being in a directory is meaningless now unless you can rank on the first page," he said. "I have complete control with pay-per-click advertising--I can get to the top, I control the headline and message--whereas with a directory listing, it's what (Yahoo editors) decide to write."
Once the primary road signs to navigating the Internet, directories have moved to the shoulder. They are being displaced by algorithmic search tools and commercial services that many people--Abbott among them--now believe do a better job in satisfying Web surfers and advertisers. The transformation is bringing to an end an altruistic era of human editors, who once wielded significant clout in driving traffic to Web sites through recommendations made without regard for commercial considerations.
The transition has sparked a power shift in the search world that is forcing directory leader Yahoo to reinvent its search business to better compete with an uprising of algorithmic and commercial search providers, most notably Google and Overture Services. In response, Yahoo over the past year has continued to distance itself from its noncommercial directory roots, adding paid search links from Overture, demoting directory listings on its search pages to results provided by Google and scooping up algorithmic search provider Inktomi.
The recent flurry of activity at Yahoo has company watchers wondering what the future holds for the portal's search tools, and what place, if any, there might be for its once dominant directory. "In October, Yahoo made the directory secondary to Google," said Danny Sullivan, editor of the industry newsletter Search Engine Watch. "Suddenly the value of getting listed in Yahoo seemed to disappear. Now, if you're not listed with Yahoo, it may not matter."
Clearly, it pays to be in the paid search business. Yahoo's deal with Overture has helped it achieve three consecutive quarters of profitability and has allowed management to boost financial expectations. Yet the success in partnering with commercial search providers raises the inevitable question, why pay dozens or hundreds of people to search the Web when another company wants to pay you to do the same?
Yahoo keeps the operations of its search editors close to the vest. Company executives will not comment on how many editors it employs. A Yahoo representative said "on average" the company employs "a building full" of directory editors. At least one search engine marketer has said that Yahoo has slowly scaled back its directory editors over recent months, giving people new duties or emphasizing paid search listings.
But Srinija Srinivasan, Yahoo's vice president and editor in chief, denied the company has recently laid off or redeployed members of its editorial staff. "The state of search technology thankfully has improved," Srinivasan said in an interview. "That said, we firmly believe there continues to be a gap between the best technology and what we can provide incrementally with the human experience."
Conceived by co-founders Jerry Yang and David Filo in a Stanford trailer in 1994, much of Yahoo's popularity was built on the directory's ability to give order and organization to the unruly Web. As legend has it, Yahoo was developed by Yang and Filo as a way to categorize their favorite sumo wrestling Web sites. Even the company name--originally the acronym "Yet Another Hierarchical Officious Oracle"--highlighted its directory roots.
Unlike the other search competitors that emerged in the mid-1990s, such as Excite, Lycos, Infoseek and AltaVista, Yahoo did not develop its technology to crawl through millions of Web sites. Instead, it hired humans to manually search the Web to find, organize and review sites about thousands of topics. Yahoo's editorial team became an emblem of the Internet's rise where legions of college graduates would do the heavy lifting to help Web newbies find what they want.
Yahoo did not rely exclusively on its directory, signing partnerships over the years with algorithmic search engines such as AltaVista, Inktomi and Google to provide backup results. But up until October, these third parties were never the centerpiece of Yahoo's search results. In late 1999, Yahoo began to tinker with its coveted directory service. Under then Yahoo CEO Tim Koogle, the company launched a new fee plan, requiring sites to pay $299 a year--$600 for adult sites--in order to be considered for inclusion in its directory listings. If accepted, companies would be required to pay the fee annually to retain their listing.
Yahoo does not break out results for this business, but said it has no plans to discontinue it. While many analysts called the move to paid inclusion overdue at the time, others believe the decision hurt the directory's credibility. "One has to wonder how the economic interests of search is messing with the altruistic agenda of the directory," said Lance Loveday, president of Web marketing consultancy Closed Loop Marketing.
Altruism vs. cash
Yahoo isn't the only directory facing criticism these days. Search engine marketers also point to the Open Directory Project (ODP) as an example of how far directories have fallen behind algorithmic search providers, both in terms of the reach and quality of the results they provide. AOL Time Warner-owned Netscape runs the ODP, which launched in June 1998 under the name NewHoo. It uses about 210,000 volunteer editors to catalog the Web, many of whom are search engine marketers. Though Google and AOL both draw on this directory for specialized searches, the service is thought to be plagued with troubles. Hardware failures over the winter holidays caused the directory to be out of commission for several months, for example.
Elisabeth Osmeloski, a search engine marketer and a volunteer editor with the ODP, said that more than 50 percent of the sites submitted for review are spam links, a major drain on volunteers' time. "The ODP has a huge backlog from bad submissions; there are sites waiting two years to be reviewed. It would be better if Netscape got (more) behind it," she said.
Bob Keating, editor in chief of the ODP, said that it's fighting the good fight against the encroachment of the profit motive into what is rightfully an editorial process. "We're trying to combat the commercialization of search," said Keating, whose ODP has a catalog of 3.8 million sites, compared to some 4 billion for Google. "A lot of Web directories have gone the other direction. As search gets even more commercialized, the Open Directory is the only one that's left that's really grounded in the original concept of the Net--that it's an information source and not a money-making vehicle."
He added that the ODP has checks and balances to stop spammers from controlling the directory. LookSmart, which launched in October 1996 and was originally backed by Reader's Digest, started as a directory of the top sites in any category on the Web. It employed hundreds of editors and writers to handpick sites. Now, it employs about 100 editors, but they largely review commercial sites that have bought into the directory, which is licensed to Microsoft's MSN. It also runs a noncommercial directory called Zeal.com that is staffed with about two or three editors and a team of volunteers.
Many small sites say they still see value in a listing in directories like Yahoo's, LookSmart's and the ODP because such links are given weight by Google's PageRank, a system for evaluating the popularity of a Web page. Google rides on the back of human-screening of Web sites, and many people see directory links as an easy step on the road to popularity in major search engines.
Tim Mayer, vice president of Fast Web Search, which was recently acquired by Overture, said that directories are largely treated like any other link on the Web, and some may be thought of as more authoritative than others. But he said the influence of the directories has faded as they have become more commercial. "The more authoritative the site or directory is that is linking to your site, the more weight given in the link popularity," said Mayer, adding that link popularity is only one feature of many in the relevance algorithm. Still, he said, it's less important "in many search engines in the past years as directories have moved from purely editorial to pay-for-play."
Story by Stefanie Olsen and Jim Hu
Source: CNET News.com
Amazon's A9.com offers both a Web site and an Internet Explorer toolbar from which users can enter search terms.
The service, in test mode for now, is operated by a Palo Alto, Calif.-based subsidiary and branded separately.
Searches also can be limited to just Amazon.com products -- as well as the text of books available at Amazon.com.
A9's service relies heavily on Google, which supplies many of the search results, and Amazon's Alexa subsidiary, which provides traffic, related sites and other information on specific Web sites.
Search results also include text ads from Google's sponsored links program. Alison Diboll, an A9 spokeswoman, declined to say whether the company eventually plans to create its own search technology. She confirmed Amazon plans to use the technology both for its online store and the rest of the Web.
''Having this e-commerce search technology as a separate company is part of Amazon's continuing development from an online retailer to a technology services company,'' she said.
Unlike other Internet search tools, A9.com has users sign in with the user name and password from their regular Amazon.com account. A9 also offers an anonymous site that does not require a user name and password.
Source: Chicago Sun
Copernic today announces an agreement with InfoSpace, to offer Copernic’s Enterprise Search indexing technology for site search through InfoSpace’s network of search distribution partners.
InfoSpace and Copernic will offer online businesses the power to affordably integrate Web and site search into their Web sites.
This allows visitors to do broad Web searches as well as search for information contained on a particular site – without ever having to leave the originating site – to quickly find the information they’re looking for. By offering Web and site search from the same search box, InfoSpace and Copernic are enabling businesses to retain customers on their sites, creating new opportunities for increased sales.
InfoSpace’s private-label search platform enables online businesses to quickly deploy and monetize Web search at their site under their own brand.
The InfoSpace solution combines a scalable and flexible co-branding architecture with the company's award-winning metasearch technology, which searches all the leading engines and returns only the best results. As the only provider to combine all the leading paid placement engines, InfoSpace offers its partners the highest match rate and revenue per query available.
The new offering also leverages Copernic’s leading enterprise search technology, which was designed from the ground up to be flexible – meeting the cost, complexity and deployment requirements of all types of companies.
Together, InfoSpace and Copernic are offering customers a cost-effective, easy-to-use and deploy, end-to-end Web and site search that returns highly relevant results in a flash – enabling customers to easily find the information they need in a single search setting, and ultimately offering a better online experience.
"Under this agreement, InfoSpace makes it easier for its private-label search partners to increase customer revenue. From a single search box, customers will be able to locate relevant results from both their favorite site as well as the Web.
By generating more site page views, marketers will have new opportunities to monetize traffic," said Richard Pelly, InfoSpace Vice President for Distribution Sales and Strategic Accounts. "Copernic site search offers exceptionally accurate results and can easily scale to meet the needs of the largest sites. When matched with InfoSpace Web search, it delivers a great search solution for high-traffic sites."
“Our partnership with InfoSpace, a major force in this market, further validates the power of our technology, helping us to achieve our goal to make search accessible to every company by meeting the price, scalability, ease-of-use and implementation requirements of all types of businesses,” said Martin Bouchard, Copernic’s president and CEO.
“InfoSpace’s leadership in the market is a perfect complement to our ongoing partnership strategy and only works to reinforce our recent office expansion and growing customer base in the United States. We look forward to an ongoing relationship with InfoSpace.”
Source: Copernic Inc.
Over the past week MozDex, an open source search engine, has been tweaking and refining its search results while in beta testing.
MozDex is the brainchild of Byron Miller and is built as an open search system using different open source technologies. MozDex plans on fully indexing the Internet in the coming weeks.
"Mozdex.com offers the first OPEN search system based on publicly available software, APIs and algorithms," said Byron Miller, president at Small Productions. "There is no secrecy in understanding the results or their ranking, which offers the first public insight into an open index."
What does Open Search Mean?
Searching the Internet has always been something people just did, without ever wondering how it worked. Mozdex.com offers insight into its search results that allows people to fully understand how pages are ranked and displayed.
The test index itself is seeded from the dmoz.org directory, Netscape's open source, volunteer-edited directory. However, MozDex, being a true search engine, is building its own index and allows website owners to submit their sites via an Add URL form.
Adding to the open source buffet that powers MozDex, the engine also uses the Nutch open source search engine and archive.org's spider technologies to run its searches.
The site is at www.mozdex.com
Review by Loren Baker
According to a new lawsuit in Beijing, a Mr. Shi has accused Internet company 3721 of infringing upon his rights. Mr. Shi claims that after he installed 3721's Network Real Name Software on his computer, not only was the existing Baidu software on his terminal deleted, but searches for information from other websites were also being illegally monitored and shielded.
Baidu is 3721's rival in the heated Chinese search engine industry.
Mr. Shi said that he had downloaded Baidu's Search Partner to facilitate data searching and information browsing; after several days, however, while browsing a website, he was directed to install the Network Real Name Software. After he followed the online instructions, he found that the previously installed Search Partner software, Baidu's icon, and the menu in the Microsoft Internet Explorer browser's address column had all been deleted.
He said it was impossible to re-install Baidu's software, and he claims 3721's software illegally prevented him from doing so. In addition, Shi's lawsuit states that whenever he browsed any website, the software continually monitored and screened his actions.
He claims this is a violation of both his privacy and his right to voluntarily choose software. Shi's lawsuit maintains that 3721 seriously affected his legal access to Internet information and impinged upon his legal use of other relevant software. In addition, it is said that 3721 did not clearly state the functions and results of its software, infringing upon his right to full disclosure.
As a result, Shi brought suit against 3721 at Beijing's No. 1 Intermediate People's Court and asked for RMB1000 (about US$130) in compensation.
3721 has said that it has done its part to notify users of the potential problems as well as solutions related to the software. The company has no other comments.
In recent years, more and more Internet consumers have been using China's emerging consumer protection laws to protect themselves from poor product manufacturing and delivery.
Last year a consumer in Beijing successfully sued an online game company because the company caused the user's online game assets to disappear.
The 3721 case is currently underway.
Source: ChinaTechNews.com
Andrew Day doesn't wear a tie to work and embraces the open-plan, casual-dress office, giving the impression of operating like one of the dotcom kings of the past. It's therefore ironic that he's about to take on the biggest names in the online world, like Google and Yahoo.
Day is chief executive of Sensis, Telstra's powerhouse directories arm. He is about to take his company - until now, primarily a print-oriented business based around the White and Yellow Pages - and create a new online search engine in the quest for more revenue.
For the first time, Day is willing to put a time frame on when his company is to take on Google: the next six to nine months, as a new search brand is wheeled out.
"Print is really our core, but the next frontier for us is search. We've really been in the search business for 50 years," said Day.
"The Yellow Pages is a search book. We don't call it that because it's not trendy. But we will start to go into a more Google search environment. This is for people that don't want to look in Yellow Pages online because they've always searched through Google or Yahoo.
"Our biggest differentiator (from Google) is we have all this local content. So if (you) go to Google today and search for a restaurant in Kew, you might get two restaurants.
"If you go to our future search vehicle and we put all the White and Yellow Pages, CitySearch and Whereis content on it, plus internet content, the same search in Kew will get a much bigger reply" - marrying local search and internet search together in one search capability.
"We expect we'll have something in the marketplace in the next six to nine months." After the service has been introduced to the internet, Sensis will offer the same service in voice (by calling an operator), mobile phone (your phone will recognise your position and send you a list of the 10 nearest Malaysian restaurants) and, eventually, interactive television.
"We will be looking like a Google, but an Australian Google," according to Day. Although Sensis is a subsidiary of Telstra, it is certainly well cashed up to expand.
Last financial year it took $1.2 billion in revenue, and recorded earnings before tax of $650 million. Day believes it is possible to multiply that earnings result by 10 to create an estimate of Sensis's current market value, which would put it in the top 20 companies on the Australian Stock Exchange if it were listed.
Its financial success has led many to conclude that, if for political reasons the second half of Telstra cannot be sold, perhaps floating Sensis would be a good second option to extract value out of Telstra.
As far as Day is concerned, a Sensis float is not on the agenda. "The issue of a float is really a Telstra issue. I don't think they have any plans at all," he said.
It might surprise some to know that Sensis, which Day says gets 99.5 per cent of its revenue from advertising, receives 13 per cent of total Australian media spending. In the online world it attracts almost 25 per cent of total advertising dollars.
While Sensis's $1.2 billion in revenue still puts it behind the amount of advertising revenue that either newspapers or television enjoy, Day gives the impression he'd like to be No.1 in the medium term.
"We'll close the gap because we do have strong growth in print and we do have stronger growth online than most newspapers in Australia would have today," he said.
Source: The Age.com
Startups and leading tech companies, including search exemplar Google, are tinkering with new ways of culling and presenting information - ones that could prompt the next revolution in search.
One company has an idea for how search engines can catalogue the Web more completely. Another believes it can better divine what a searcher wants. Yet another is trying to synch all that with how the human brain works.
"Because information is exploding, (the Internet) is going to become increasingly difficult to use if we don't get it right," said Liesl Capper, chief executive of Australian search startup Mooter.
Current technology troubles users like private investigator Cynthia Hetherington. When she recently suspected an Australian company of possible fraud, Hetherington turned first to Google. But then she had to go to the Australian Securities and Investments Commission, LexisNexis and Dun & Bradstreet.
"It's very frustrating," said Hetherington, who runs a Haskell, N.J., company. "It's like going to a library and only pulling one book off the shelf."
Users who consider Google exhaustive are only fooling themselves, experts say. Today's search engines may be capturing as little as 1 percent of the Web, largely because of how they find and index online resources.
Search analyst Danny Sullivan sees promise in developments to address such flaws, and he believes tomorrow's search engines are likely to blend the best of these approaches.
But he also cautioned that the Internet is littered with search innovations that failed to draw investors or market share.
Currently, all search engines fail to capture the bulk of the "invisible Web" - resources locked up in databases and inaccessible by the engines' indexing crawlers. These include regulatory filings at the U.S. Securities and Exchange Commission, detailed reports on charities at GuideStar and complete archives of most newspapers.
Sometimes, accessing an "invisible" database requires payment. Search engines can't let you know about a document's availability for purchase if they can't scan it in the first place.
But even when a database is free, a site may require registration, prohibit search crawlers or use incompatible formats.
In particular, crawlers are stymied by dynamic Web pages, which are customized as users choose various options, such as car color at Cars.com.
To counter that, Chicago-based Dipsie Inc. is developing software that promises to fill out Cars.com's simple online forms, which are based on multiple choice, though not the complex ones for the government's patent and trademark databases, which require typing in keywords. A public test version is expected by summer.
Other companies are working to capture sound and video files that have troubled text-based crawlers.
StreamSage Inc. uses speech-recognition technology to transcribe feeds, so a search engine can pull out relevant portions of a long presentation. Company president Seth Murray said Harvard's medical school and NASA already use the technology, but engineers still must speed it up for broader use.
Yahoo Inc. (YHOO) is going a less technical, more controversial route: Businesses can pay to ensure that their "invisible Web" pages get indexed.
But indexing more of the Web only brings up another challenge - identifying the most relevant among the billions of documents available. So some search developers are focused on personalizing and organizing searches.
Eurekster Inc., a startup launched in January, is marrying search with social networking, in which friends, your friends' friends and their friends form online circles. Eurekster guesses what you're seeking based on what others in your circle have found relevant.
"At the moment, when you search on Google, everyone gets the same results for the same keywords," said Shaun Ryan, vice president of business development for Eurekster in New Zealand. "We try to personalize those results."
So a search for "casting" might produce sites on movies if your circle is heavily in entertainment, fly fishing if members enjoy weekend outings.
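The circle-based personalization described above can be sketched roughly as follows. This is only an illustrative assumption of the general idea - boosting results a user's social circle has previously clicked - not Eurekster's actual implementation; all names and weights here are made up.

```python
from collections import Counter

def personalize(results, circle_clicks, boost=10.0):
    """Re-rank baseline `results` (URLs, best first) by blending the
    baseline position with how often the user's social circle has
    clicked each URL for similar queries."""
    clicks = Counter(circle_clicks)  # url -> click count from friends
    def score(pair):
        rank, url = pair
        # lower baseline rank is better; each friend click offsets rank
        return rank - boost * clicks[url]
    ranked = sorted(enumerate(results), key=score)
    return [url for _, url in ranked]

baseline = ["movies.example", "fishing.example", "dict.example"]
friends = ["fishing.example", "fishing.example"]  # circle clicked fishing twice
print(personalize(baseline, friends))
# fishing.example moves to the top for this user's circle
```

Two users issuing the identical query would thus see different orderings, which is exactly the departure from one-size-fits-all results that Ryan describes.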
The major search engines, meanwhile, are trying to localize results, with Yahoo! and America Online having an advantage over Google because they already hold billing or registration information on many users.
And sites like SuperPages.com are tagging data, so customers can search not only by city but by store hours or credit cards accepted. Adding "Saturday" to a Google search might get you a store that's closed Saturday, or it might indicate Saturday's hours.
Tags also help Factiva personalize its archives of 9,000 news sources, so an engineering team gets tech-heavy results, while the marketing department gets consumer-friendly documents.
"People don't want to be spending time searching and looking for things," said Clare Hart, Factiva's chief executive. "They want to be spending the time analyzing the information."
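The difference between keyword matching and the tagged data that SuperPages.com and Factiva describe can be sketched in a few lines. The listing fields and values below are illustrative assumptions, not either company's actual schema: the point is that structured fields can be filtered directly, so "Saturday" means "open on Saturday" rather than any page mentioning the word.

```python
# Toy listings carrying structured tags instead of free text.
stores = [
    {"name": "Corner Books", "city": "Springfield", "open_days": {"Mon", "Sat"}},
    {"name": "Main St Deli", "city": "Springfield", "open_days": {"Mon", "Fri"}},
]

def search(listings, city=None, open_on=None):
    """Return names of listings matching every structured criterion given."""
    hits = listings
    if city is not None:
        hits = [s for s in hits if s["city"] == city]
    if open_on is not None:
        hits = [s for s in hits if open_on in s["open_days"]]
    return [s["name"] for s in hits]

print(search(stores, city="Springfield", open_on="Sat"))
# only the store actually open on Saturday is returned
```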
At Microsoft Corp., researchers are exploring ways to return specific facts rather than entire documents. A search for "Marilyn Monroe's birthday" would return an answer, "June 1, 1926," instead of sites on her famous "Happy Birthday, Mr. President" performance.
"We still have this library metaphor of 'Let me give you back a bunch of books that might help you' ... rather than 'Let me go through the books for you and figure out what you're looking for,'" said Eric Brill, a senior researcher with Microsoft's AskMSR project.
Mooter tries to mimic the brain's organization methods by identifying underlying themes and grouping sites - a search on travel in Spain might separate hotels from warnings about terrorism. Mooter also attempts to refine results based on links a user visits.
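The grouping idea attributed to Mooter can be illustrated with a crude sketch. A real system would use statistical clustering over the result text; the keyword bucketing below is only an assumption made for illustration, with made-up themes and snippets.

```python
def group_by_theme(results, themes):
    """Bucket (title, snippet) pairs under the first theme word their
    snippet contains; everything else goes under 'other'."""
    groups = {t: [] for t in themes}
    groups["other"] = []
    for title, snippet in results:
        words = snippet.lower().split()
        for t in themes:
            if t in words:
                groups[t].append(title)
                break
        else:  # no theme word matched this snippet
            groups["other"].append(title)
    return groups

hits = [
    ("Madrid Hotel Deals", "cheap hotel rooms in central madrid"),
    ("Travel Advisory", "government warning for travellers"),
]
print(group_by_theme(hits, ["hotel", "warning"]))
# hotels and warnings land in separate groups, as in the Spain example
```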
Building the technology is expensive, and some experts believe the best tools may be developed by and reserved for pay services like Factiva and ChoicePoint Inc., which aggregates personal, financial and legal data from a variety of government and corporate sources.
But don't count Google out. It has hundreds of engineers in California, New York, India and soon Switzerland working to make searching better, most recently with localized searching.
Google's director of technology, Craig Silverstein, said the industry leader must keep innovating because search is bound to morph into something completely different within a decade.
"It will be something that we haven't even thought of yet," Silverstein said. He offered few details, but the Google Labs site offers a peek.
One project, Google WebQuotes, returns listings with comments from other sites to help you evaluate a site's credibility and reputation.
Article by Anick Jesdanun
Source: Associated Press
Christchurch software developer Eurekster has created a search engine tool which makes the task of tracking down information on the web more of a team effort.
The tool, also called Eurekster, refines searches and calls up results based on what friends and contacts in customers' online "social networks" have previously shown interest in.
Users sign up for a log-on and password and encourage friends and colleagues to also use Eurekster.
If a person's friends do a search on digital cameras and go to particular websites to seek product information or to place orders, then those sites will come to the top of the list if the customer later searches for information on digital cameras on the web.
Friends can help filter out a lot of the garbage dredged up by internet searches and time can be saved in the workplace by getting to more relevant resources faster.
Grant Ryan, chief executive of Eurekster, says social networking is the most common way of filtering information in "the real world", usually through word-of-mouth recommendations.
"You ask someone who you know and trust. We do it all the time. The people we choose to hang out with have a similar view of life. They pass on good information."
Mr Ryan says the internet industry's major players had put personalisation in the "too hard" basket. Now they are likely to be taking notice.
He says Eurekster's approach is the next logical step from existing methods of ranking search results. These can involve search engine staff deciding what information on the web is likely to be most relevant and automatic systems that weigh up the number of links to a site.
Popular search terms used by customers' acquaintances can also be displayed on customers' Eurekster home page.
Eurekster will perform a "regular" unaided search if customers don't log on with a password, or if they select that as an option.
Soon Eurekster will add tabs to the search engine so that – once logged in – customers can flick from a regular search engine view to the socially-filtered one, or to views that take account of the search actions of particular groups of friends or colleagues.
The tool is designed to ignore friends' searches related to pornography, to reduce the risk of people finding out more about their acquaintances than they would care to.
Eurekster sits on top of search engine results generated by Yahoo subsidiary Fast, which Mr Ryan says has a website index similar in size to market leader Google.
He says Eurekster could be implemented to work in conjunction with any search engine. It does not compete, but rather seeks to partner with major search engine providers.
The carrot is that Eurekster's technology will make customers loyal to a particular search engine, as they return to benefit from the personalised approach, he says.
Eurekster has been incorporated in the US and is jointly owned by New Zealand-based sister companies SLI Systems and Real Contacts.
SLI is 15 per cent owned by US media giant NBC. Most of the rest of the owners of SLI and Real Contacts are New Zealanders.
Mr Ryan says the group has patented some very specific pieces of intellectual property used in the product and has further patents being processed.
He says Eurekster has combined its parent firms' six years of experience in search technology and three years experience in social networking technology.
SLI Systems serves up 300 million Internet searches a month and Mr Ryan says it is likely to triple that volume in the next couple of months.
He says most of SLI's business at the moment involves handling keyword suggestions. For example, if a user puts the word "diabetes" into a search engine, SLI's software will suggest "glucose" and "insulin" as related search items.
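Suggestion of related terms like this is often driven by co-occurrence in past queries. The sketch below is a guess at the general technique, not SLI's software; the query log is invented for illustration.

```python
from collections import Counter

def suggest(term, query_log, top=2):
    """Suggest terms that most often co-occur with `term` in logged queries."""
    cooc = Counter()
    for query in query_log:
        words = query.lower().split()
        if term in words:
            cooc.update(w for w in words if w != term)
    return [w for w, _ in cooc.most_common(top)]

log = [
    "diabetes insulin dosage",
    "diabetes glucose monitor",
    "insulin pump reviews",
    "diabetes insulin resistance",
]
print(suggest("diabetes", log))
# "insulin" co-occurs most often, so it tops the suggestions
```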
Mr Ryan says Eurekster's technology will be offered as another added-value service for search engine customers and as an add-on for other web-based social networking services.
"Booble", a new adult Website parodying Google, has hit the Net running, allowing Net surfers with a thing for porn to filter more than six thousand hand-selected adult Internet content listings.
Booble is said to be the brainchild of a former Net executive, whose identity isn't yet known but who is based in New York and is putting his own money into the project, described as a "light-hearted parody of the world's largest and best-known search engine."
Google was said to be unavailable for comment on Booble as this story went to press. But Booble's mastermind says his site links to Google, "and so far we haven't heard from them."
This isn't exactly unprecedented on the adult Internet: for several years, Youho has enjoyed success with its parody of the original look and layout of kingpin portal Yahoo. Booble's developer-founder, who asked for anonymity to keep from being banned from his daughter's school softball games, according to Agence France Presse, aims for a site that's both fun and useful.
"I have a Web development operation where there is a bit of a frat boy atmosphere, so we stumbled on Booble," he told the news agency. "What was a bit fun and a joke became a business. People like it. It makes people smile. It's funny and I think it'll grow."
The sites to which Booble directs users will be filtered to exclude "illegal or extremely hardcore material," AFP said, with criteria for that to include whether the site is worth the price it charges and the quality of its images.
"It is fun, but there is a real story behind Booble in that it's hard to find good adult content," the creator told AFP. "There are about 25,000 adult sites and a fraction of sites in really major categories like movies and music and sports, causing a lot of clutter and confusion."
And he hopes users get the joke while having their adult fun. "We don't want to do anything that is illegal," he told AFP. "It's a parody, it's funny and we're not out to confuse anybody, so we hope they will take the joke in the spirit in which it was intended."
Source: AVN Online