
CyberHood Watch Discussion With Marcus P. Zillman On Bots – Deep Web Search – Cloud Computing – SEO & The Future Of The Internet

by dballard on October 28, 2009

We had a bit of audio difficulty on today’s show; even so, Marcus P. Zillman provided our listeners with some extraordinary information. We asked Marcus at the end of the show what he thought the future of the Internet would be. Marcus said that if you thought the last ten years were exciting, you had better buckle in, because compared to the next ten you haven’t seen anything yet. Download the show, fast-forward through (and overlook) the static, and find out what Marcus says about the future of the Internet.

Robot… Robot = Bot. Think back to your algebra classes, when you were first introduced to equations and, later, algorithms: step-by-step sets of instructions for carrying out a specific action. A bot is simply an algorithmic script, a set of commands written to follow a specific sequence and perform a particular function.
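As a concrete (and purely illustrative) example, here is a minimal sketch of a bot in Python: a short script that follows a fixed sequence of commands, fetching a page and checking it for a keyword, with no human in the loop. The URL and keyword are hypothetical placeholders, not anything discussed on the show.

# A toy bot: fetch a page and check it for a keyword.
# The URL and keyword are hypothetical placeholders.
import urllib.request

URL = "https://example.com/"      # hypothetical page to watch
KEYWORD = "deep web"              # hypothetical term to look for

def run_once():
    # Step 1: fetch the page.
    with urllib.request.urlopen(URL, timeout=10) as response:
        page = response.read().decode("utf-8", errors="replace")
    # Step 2: apply a simple rule to what came back.
    if KEYWORD.lower() in page.lower():
        print(f"Found '{KEYWORD}' at {URL}")
    else:
        print(f"No mention of '{KEYWORD}' at {URL}")

if __name__ == "__main__":
    run_once()

Scheduled to run on its own (say, once an hour), even a script this small starts to behave like the bots Marcus describes: it does the repetitive looking so you don’t have to.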

In the case of the CyberHood Watch partners, Dave & Bill, the term bot first conjures up a malicious hacker sending out a scripted piece of software to look for vulnerabilities on computers connected to the Internet. Any PC found by the hacker’s malicious code is then taken over for his personal use without the owner’s awareness. A PC under the control of the hacker is called a “zombie computer”. When the hacker has commandeered multiple PCs by executing his malicious software (the “bot”), he controls a large number of computers, a “botnet”, which he can put to malicious use.

Marcus’s perspective is such a refreshing look at bots in a positive and proactive light that it has personally altered how I perceive them: as constructive, helpful applications put to good use.

Marcus, if you happen to read this post: is there a difference between “code”, “scripts”, “algorithms”, and a “bot” written by a programmer? When does code, a script, or an algorithm become a bot?

What an interesting thought: to create your own bots, your own “mini Yahoos”, and your own search engine. Marcus calls his “subject tracers”; he has created fifty-one of them to go out and bring back specific, high-quality information, and he has made most of them available for free.
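To give a feel for what a subject tracer might look like under the hood, here is a minimal sketch, assuming a fixed list of hypothetical sources and a single topic keyword; it is an illustration of the idea, not Marcus’s actual implementation.

# A toy "subject tracer": visit a fixed list of sources and report
# only the ones that mention the topic of interest. The source URLs
# and topic below are hypothetical placeholders.
import urllib.request

SOURCES = [
    "https://example.com/research",
    "https://example.org/papers",
]
TOPIC = "cloud computing"

def trace(topic, sources):
    hits = []
    for url in sources:
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                text = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip sources that are unreachable right now
        if topic.lower() in text.lower():
            hits.append(url)
    return hits

if __name__ == "__main__":
    for url in trace(TOPIC, SOURCES):
        print("Relevant source:", url)

A real subject tracer would of course maintain its own list of specialist sources for a niche and rank or summarize what it finds; the point is simply that a small amount of scripting can do the repetitive gathering for you.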

The common assumption is that if you can’t find it with a Google, Yahoo, or Bing search, it doesn’t exist. Nothing could be further from the truth. Marcus commented that only five to ten percent of the information on the Internet is searched and found. Imagine: ninety to ninety-five percent of the information on the Internet is still out there waiting to be found. This is where a “deep web search” becomes significant, turning up the quality information that today’s search engines miss.

Did you ever consider that a lot of information on the Internet isn’t set up to be found? Research papers, databases, and professional papers (often PostScript files) are examples of high-quality information available on the Internet that search engines miss.

Many of us have fallen into this trap, myself included: information overload. Have you ever found yourself surfing the web, finding great information that leads to more great information, and then even more, and at the end of the day you have read a pile of great information that has added little value to your specific niche?

That is why Marcus says that when using the Internet for a specific niche, it is very important to focus on exactly what you are after, write it down in “great big letters”, and stick it to the front of your computer. There is so much information available that you need to stay focused, and a constant reminder will help you stay on task.

Marcus has spent a great deal of his professional career building bots for specific research, and he has made them available on his websites. This is the information found only through deep searches: the ninety to ninety-five percent of everything on the Internet that the major search engines miss.

by: David C. Ballard
