Boost Your Google Rankings: The Ideal Website According to Google
Unlock the secret of Google's ranking system. Empower your business to create an optimized website for better search engine visibility. Dive in now.

What does the ideal website look like to Google?

At Fruition we have a greater focus on statistics and the use of probability than any other SEO services firm. So what is statistics, and how does it apply to SEO? Statistics is the study and presentation of data; probability is the mathematical foundation of all statistical properties. In most cases we use statistics by making assumptions about our data. Common assumptions range from the underlying distributions to equality of variance to our data being random. All of these assumptions are necessary for the underlying probability laws to apply to our data. Unfortunately, in practice our assumptions are often false, and the reason is the nature of the data we collected in the first place. Assumptions are like stereotypes: though they always include some amount of truth, you can never really know how true they are. Though we say we test our assumptions, we are really just making sure they are not patently false; we are not verifying that they are true. Translated into what this means for your website: at Fruition, we knock out the assumptions that waste time in SEO projects.

From a statistical perspective, search engine optimization provides wonderful data. We are not measuring human variables. Though Google will tell you that its ranking is about "good" sites, the truth is that no computer can make a judgment call like deciding whether a site is "good" or what the "perfect" site is. A computer can only measure the presence and extent of different, predefined variables. An algorithm is simply a complex set of equations that incorporates many variables. Google has developed an algorithm, which it constantly updates, that approximates its definition of a "good" site based on many undisclosed variables. To this end, Google often employs human reviewers and incorporates their findings into its ranking results.

In order to work with numbers, statistics requires data sets. One number is never enough.
Rather, we look at groups of related numbers. These groups are often a single measurement made multiple times, like visits to a website. We could measure visits daily, then compare those daily visits to the daily visits of another site, and so compare the two websites. Similarly, we can compare a single website's performance before and after a specified event. That event could be a known Google update (like Penguin, Panda, or Hummingbird), or we could simply check whether modifications to the website have had a measurable impact.

In searching for updates, we can look at groups of sites and see how they perform on either side of a certain date. When many sites react similarly on a single day, something outside the sites themselves has likely occurred. The Monday after Thanksgiving will show increased traffic to many retail sites; the cause is the shopping season. When a widespread change has no such explanation, we can conclude that an update has occurred. So we look for dates where substantial decreases in traffic occurred across many websites to find unannounced Google updates.

There are many variables that are likely part of Google's algorithm. We can build statistical models to determine the relationship between these variables and site performance. These models show only that the variables are related to, though not necessarily causing, changes in site rank. Establishing which variables cause a site's rank to improve requires controlled studies. Such studies can easily be done within a single website across multiple pages: by making the same change to a subset of the pages and tracking the impact, we can begin to build a profile of the effect each variable has. As a result, we can begin to determine not only which variables matter, but how they matter. This is the strength of Fruition's SEO methodology! When we look at search results, we are not seeing which sites are the best.
Rather, we are seeing the result of Google's algorithm being applied to the sites and search terms in question. These variables lack any human element; they are purely the result of numerical and probabilistic properties. Unlike many other applications of statistical research, at Fruition we find ourselves collecting data to which many of our assumptions may genuinely apply. If you're interested in learning more about Fruition's SEO methodology and how you can make your website look good to Google, just click the contact us button. And if you haven't tried the Google penalty checker, give it a try for a complimentary look at our in-house SEO tools.
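To make the update-detection idea concrete, here is a minimal sketch of how one might flag candidate update dates from daily traffic data. The site names, visit counts, and thresholds below are all invented for illustration; this is not Fruition's actual tooling, just the general technique: compute each site's day-over-day drop, and flag any day where a large fraction of sites fall past a threshold at once.

```python
# Hypothetical daily organic visits for a few monitored sites.
# All numbers are invented; a real analysis would pull from analytics data.
traffic = {
    "site_a": [120, 118, 125, 70, 68],
    "site_b": [300, 310, 295, 180, 175],
    "site_c": [55, 60, 58, 57, 59],   # a site unaffected by the event
}

def candidate_update_days(traffic, drop_threshold=0.25, site_fraction=0.5):
    """Return day indices where at least `site_fraction` of sites lost
    more than `drop_threshold` of the previous day's visits.

    A seasonal cause (e.g. a holiday) would need to be ruled out before
    treating a flagged day as an unannounced algorithm update."""
    n_days = len(next(iter(traffic.values())))
    flagged = []
    for day in range(1, n_days):
        drops = 0
        for visits in traffic.values():
            prev, cur = visits[day - 1], visits[day]
            if prev > 0 and (prev - cur) / prev > drop_threshold:
                drops += 1
        if drops / len(traffic) >= site_fraction:
            flagged.append(day)
    return flagged

print(candidate_update_days(traffic))  # [3]: site_a and site_b both drop sharply
```

In this toy data, two of the three sites lose over a third of their traffic on the same day while the third holds steady, so that day is flagged; a single site dropping alone would not be, which is the point of looking across many sites rather than one.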
