Friday, September 16, 2011

Culturomics: Time Wave Zero, Web bot & Predictions 3b

How does Web bot stand up against the now better-known and government-funded Culturomics?
Web bot is just that: an internet bot.  There are many kinds of internet bots, built for many purposes.  In case you are not acquainted with the term, internet bots are programs that run automated tasks on the internet.

Normally, these bots perform simple, repetitive tasks that would be tedious for humans.  The most famous example of a bot is the web spider, or web crawler, which browses the web in a methodical, automated manner.  Google uses one constantly to improve and update its searches.  Wikipedia provides a good working explanation of what a web crawler does:
Many sites, in particular search engines, use spidering as a means of providing up-to-date data. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine that will index the downloaded pages to provide fast searches. Crawlers can also be used for automating maintenance tasks on a Web site, such as checking links or validating HTML code. Also, crawlers can be used to gather specific types of information from Web pages, such as harvesting e-mail addresses (usually for sending spam).
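The core of the crawling described above is link extraction: fetch a page, pull out its links, and queue the unvisited ones.  Here is a tiny Python sketch of that first step, using only the standard library; the hard-coded page stands in for a live fetch.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag -- the first step of any crawl."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# A real crawler would download each page over the network; here we
# parse a small hard-coded page so the example is self-contained.
page = '<p><a href="/about">About</a> and <a href="/faq">FAQ</a></p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # → ['/about', '/faq']
```

From here a spider simply repeats: fetch each discovered link, extract its links in turn, and index what it finds.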
The Web bot is actually an algorithm called ALTA (Asymmetric Language Trend Analysis).  ALTA was developed by two people, Clif High and George Ure, who call themselves "The Time Monks."1  George Ure runs a free website called Urban Survival.  Mr. Ure also runs pay sites, collectively called peoplenomics, which deal with finances, politics, economics, and so on.  Mr. Ure holds an MBA and does business and marketing consulting.  Mr. High is a computer programmer.  He runs a website called Half Past Human, and he also has a forum at webbotforum.  We have a short bio on him,
Clif is a computer programmer. In the 90's he worked as a consultant to some large corporations. His field of expertise is as a database administrator. He is what is known as a guru of sorts on data storage and manipulation. Clif has worked on some large projects including phone company data, automotive industry data, and large software companies in the Seattle Washington area.2
We include a short video produced by the History Channel which gives an overview of the web bot program (ALTA).  If you cannot see the embedded video, here is the link: http://youtu.be/2UFQJx-TVIY.


So while mainline Culturomics focuses mostly on print news media and broadcasts, Web bot concentrates on the Internet.  Also, Mr. High assumes that all human beings are psychic and therefore collectively sense the future before it happens.  This is a view it shares with McKenna's Time Wave Zero.  We shall let Mr. High explain it in his own words,
We employ a technique based on radical linguistics to reduce extracts from readings of dynamic postings on the internet into an archetypical database. With this database of archetypical language, we calculate the rate of change of the language. The forecasts of the future are derived from these calculations. Our calculations are based on a system of associations between words and numeric values for emotional responses from those words. These 'emotional impact indicators' are also of our own devising. They are attached to a data base of over 300/three hundred thousand words. This data base of linked words/phrases and emotions is our lexicon from which the future forecasting is derived.3
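High has never published his lexicon or his "emotional impact indicators," so to make the idea concrete, here is a deliberately miniature sketch of what a word-to-emotion lexicon might look like.  The words, scores, and sample texts below are all invented for illustration; the only points taken from High's description are that words carry numeric emotional values and that the forecast comes from the *rate of change* of the aggregate language, not from any single snapshot.

```python
# Hypothetical miniature "lexicon": the real one reportedly links
# 300,000+ words to emotional values; these entries are invented.
lexicon = {
    "collapse": -0.9, "panic": -0.8, "shortage": -0.6,
    "recovery": 0.7, "boom": 0.8, "calm": 0.5,
}

def emotional_score(text):
    """Average the emotional values of the lexicon words found in text."""
    hits = [lexicon[w] for w in text.lower().split() if w in lexicon]
    return sum(hits) / len(hits) if hits else 0.0

# ALTA is said to forecast from the rate of change of such scores,
# so we compare two sampling periods rather than one absolute value.
week1 = emotional_score("markets calm recovery expected")
week2 = emotional_score("panic shortage collapse feared")
print(round(week2 - week1, 2))  # → -1.37, a sharp negative shift in tone
```

Whether averaging emotion scores over forum chatter can say anything about the future is, of course, exactly the claim at issue; the sketch only shows the mechanics High describes.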
According to Tony Sokolski (a professed friend of High),  "Clif noticed how language was stored and used as data was forming patterns. He developed programs to collect these patterns from internet forum chat rooms. Clif noticed that the words he gathered would give him a picture of events to come."4  High goes on to explain the process by which he arrives at his conclusions.
The beginning of all of our processing is the word. Or words, rather - actually excessively large amounts of them. Totals of words beyond all reason. These then are distilled down into a thick syrupy mass and placed in an inadequate visual display and from there interpretation proceeds. 
Our interpretations of the data sets that we accumulate are presented in the form of a series of reports which detail the interpretations of the changes in language and what we think that they may mean.5
High divides his data into categories he calls entities.  As far as we can tell, he has eleven entities:

  1. Markets -  "Descriptor set for all trading including paper debts, commodities, global currencies, et cetera."
  2. Bushista - "Proxy for the cabal required to keep Geo. W. Bush in power"
  3. Populace/USA - "Proxy for the populace of the USofA. Defined broadly and representative of issues which rise to national prominence."
  4. GlobalPopulace - "Proxy for all populace groups *not* part of Populace/USofA. Division forced by emotional splits arising from anti-Bush reactions post October 2001."
  5. Terra - "Largest of the entities. Proxy for all biosphere related sub sets at all levels."
  6. Press - "Proxy for print, teevee, and most radio. Define with corporate sponsorship or funding as requirement. So increasingly *does* apply to some bloggers. Represents the conduit for 'official' messages at all levels. Also referenced as the 'global mediastream'."
  7. FreeStream - "Proxy for independent information exchange. Rarely discussed as it has potential for circuitous references."
  8. Space - "Proxy for official UFO, NASA, NSA, and other 'strange reports'from/concerning Space including local to earth environmental reports. But specifically excludes any 'officially denied' contexts for [space] related subjects."
  9. TPTB - "Proxy for The Powers That Be. Proxy for groups which may be loosely described as both 'elitist', and 'global' in thinking. Includes proxies for such as RCC, UN, OD, Illuminati, CFR, and others too numerous to detail here."
  10. SpaceGoatFarts (SGF) - "This entity is newly formed. While references to Space Goat Farts have appeared within Terra, sub set Space in the past, the data set now demonstrates a need for an entity to allow for the growth of 'unknown forces of all kinds from space which are terra/populace affecting'. Includes non-earth based, non-human/non-mammal intelligent beings usually considered as 'space aliens' or 'extraterrestrials'. Created during 1207 clean up."
  11. FuturePop - "This entity is forming from the 'self organizing collective' meme which has been in the ALTA reports for a number of years. The entity is a proxy for those [new forms] of [social grouping] which result from the aggregation of individual responses to the continuing biospheric degradation."
Meta Data Layers
High describes these layers as "linguistic concepts."  These "lexical structures" appear in all the major entities listed above.  High now gives us his most concise explanation of what ALTA does,
Our process begins with internet software agents which read in vast quantities of text from the public, and commonly accessible areas of the internet. We hunt for any of the words which are used as 'descriptors' within our process to define a context. Please see graphic below this discussion. These 'descriptors' are representative words or phrases which are used to define the basic 'idea' or 'concepts' of interest at that point. 
Note we do NOT use 'conscious expressions' located on the internet, nor simple word counting techniques. We do not read emails. We only access publically posted texts on the internet seeking [not] so much what is there, as what has recently changed there. So as a rule we concentrate our software agents on the forums and other community based sites.
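The "descriptor" hunt High describes amounts to checking which contexts of interest a piece of text triggers.  A minimal sketch, with entirely invented descriptor sets (High has not published his), might look like this:

```python
# Hypothetical "descriptors": representative words that, per High,
# define a context of interest. These sets are invented examples.
descriptors = {
    "markets": {"crash", "rally", "sell-off"},
    "terra": {"quake", "flood", "drought"},
}

def contexts_hit(text):
    """Return which descriptor sets the text triggers."""
    words = set(text.lower().split())
    return sorted(ctx for ctx, terms in descriptors.items()
                  if words & terms)

print(contexts_hit("a sudden crash after the flood"))  # → ['markets', 'terra']
```

In ALTA's terms, each match would then pull in the surrounding text for the later aspect/attribute processing.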
High continues, 
What we do employ is a series of programming steps which reduce the text read down to 4/four digit hexidecimal integers which are themselves held in sets of 'inter linked linguistic discoveries'. These sets of hexidecimal integers are then aggregated along with information in a general sense as to where the text was located. The text returned is aggregated through further processing, producing a very large SQL based data base which is accessed by our prolog processing software. Once the data starts rolling in, the processing starts by associating the 'descriptor' with a whole group of 'values'. Together these form an 'aspect/attribute' coupling. In turn, the aspects/attributes are gathered into sets, and then humans examine these for various timing and manifestation clues.
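High gives no detail on how text becomes his "4/four digit hexadecimal integers," so the following is speculative: a toy reconstruction in which each word is hashed down to a four-hex-digit code and paired with its source, producing rows of the kind his SQL database might hold.  The truncated MD5 hash is our stand-in, not his method.

```python
import hashlib

def word_code(word):
    """Reduce a word to a 4-hex-digit code. The real ALTA mapping is
    unpublished; a truncated MD5 digest stands in here."""
    return hashlib.md5(word.lower().encode()).hexdigest()[:4]

def aggregate(texts_by_source):
    """Pair each word's code with where it was seen, ready to load
    into a database table of (source, word, code) rows."""
    rows = []
    for source, text in texts_by_source.items():
        for word in text.split():
            rows.append((source, word, word_code(word)))
    return rows

rows = aggregate({"forum-a": "markets falling fast"})
for row in rows:
    print(row)
```

The point of the reduction, as High tells it, is volume: collapsing "totals of words beyond all reason" into compact codes that can be aggregated and queried before any human interpretation begins.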
High makes it clear that there is a very critical "human" factor to this: analysts examine the "timing" and "manifestation" clues.  The process is a lengthy one.
...data gathering can take up to 3/three weeks to begin filling the data bases for processing. In total, in any given series , the data gathering will continue for an additional 3/three to 4/four weeks. Once the interpretation is begun, a series of reports are prepared to present our findings in an entertaining and informative manner.6
His respect for McKenna is unabashed.  You can hear the short statement he makes on Time Wave Zero.  If you cannot hear the audio portion, here is the link: http://youtu.be/giWYBN5mR4Q
We post a full interview that took place on the Truth Frequency radio show sometime in 2009.  Readers may judge for themselves the value and power of this ALTA program.  No doubt there will be more of these programs, and they will increase in accuracy.  In this show, they discuss their past predictions and their fulfillment.  If you cannot see the embedded video, here is the link: http://youtu.be/oqGvdBPM0kA.


Lastly, we post for your examination a set of videos comparing the Time Wave Zero model with the Web bot model.  If you cannot see the embedded video, here is the link: http://bit.ly/otvAnV.
