Big Data at Work - Big Data Bootcamp: What Managers Need to Know to Profit from the Big Data Revolution (2014)

Chapter 4. Big Data at Work

How Data Informs Design

When it comes to data and design, there’s a struggle going on. Some believe that the best designs come from gut instinct while others hold that design should be data-driven. Data does have a role to play in design and modern technologies can provide deep insight into how users interact with products—which features they use and which ones they don’t. This chapter explores how some of today’s leading companies, including Facebook and Apple, apply data to design and how you can bring a data-informed design approach to developing your own products.

How Data Informs Design at Facebook

If there is one company whose design decisions impact a lot of people, more than a billion of them, it’s social networking giant Facebook. Quite often, when Facebook makes design changes, users don’t like them. In fact, they hate them.

When Facebook first rolled out its News Feed feature in 2006, back when the social networking site had just 8 million users, hundreds of thousands of students protested. Yet the News Feed went on to become one of the site’s1 most popular features, and the primary driver of traffic and engagement, according to Facebook Director of Product Adam Mosseri.2

That’s one of the reasons that Facebook takes what Mosseri refers to as a data-informed approach, not a data-driven approach to decision making. As Mosseri points out, there are a lot of competing factors that can inform product design decisions. Mosseri highlights six such factors: quantitative data, qualitative data, strategic interests, user interests, network interests, and business interests.

Tip Rather than taking a strict data-driven approach to product decisions, consider taking a more flexible data-informed approach. That distinction has served Facebook well.

Quantitative data is the kind of data that shows how people actually use the Facebook product. This might be, for example, the percentage of users who upload photos or the percentage who upload multiple photos at a time instead of just one.

According to Mosseri, 20% of Facebook’s users—those who log in more than 25 days per month—generate 85% of the site’s content. So getting just a few more people to generate content on the site, such as uploading photos, is incredibly important.

Qualitative data is data like the results from eye-tracking studies. An eye-tracking study watches where your eyes go when you look at a web page. Eye-tracking studies give a product designer critical information about whether elements of a web page are discoverable and whether information is presented in a useful manner. Studies can present viewers with two or more different designs and see which design results in more information retention, which is important knowledge for designing digital books or building a compelling news site, for example.3

Mosseri highlights Facebook’s introduction of its Questions offering, the capability to pose a question to friends and get an answer, as an example of a strategic interest. Such interests might compete with or highly impact other interests. In the case of Questions, the input field necessary to ask a question would have a strong impact on the status field that asks users, “What’s on your mind?”

User interests are the things people complain about and the features and capabilities they ask for.

Network interests consist of factors such as competition as well as regulatory issues that privacy groups or the government raise. Facebook had to incorporate input from the European Union for its Places feature, for example.

Finally, there are business interests, which are elements that impact revenue generation and profitability. Revenue generation might compete with user growth and engagement. More ads on the site might produce more revenue in the short term but at the price of reduced engagement in the long term.

One of the challenges with making exclusively data-driven decisions, Mosseri points out, is the risk of optimizing for a local maximum. He cites two such cases at Facebook: photos and applications.

Facebook’s original photo uploader was a downloadable piece of software that users had to install in their web browsers. On the Macintosh Safari browser, users got a scary warning that said, “An applet from Facebook is requesting access to your computer.” In the Internet Explorer browser, users had to download an ActiveX control, a piece of software that runs inside the browser. But to install the control, they first had to find and interact with an 11-pixel-high yellow bar alerting them about the existence of the control.

The design team found that of the 1.2 million people that Facebook asked to install the uploader, only 37 percent tried to do so. Some users already had the uploader installed, but many did not. So as much as Facebook tried to optimize the photo uploader experience, the design team really had to revisit the entire photo-uploading process. They had to make the entire process a lot easier—not incrementally better, but significantly better. In this case, the data could indicate to Facebook that they had an issue with photo uploads and help with incremental improvement, but it couldn’t lead the team to a new design based on a completely new uploader.

The visual framework that Facebook implemented to support third-party games and applications, known as Facebook applications, is another area the design team had to redesign completely. With well-known games like Mafia Wars and FrontierVille (both created by Zynga), the navigation framework that Facebook implemented on the site inherently limited the amount of traffic it could send to such applications. While the design team was able to make incremental improvements within the context of the existing layout, they couldn’t make a significant impact. It took a new layout to produce a dramatic uplift in the amount of traffic such applications received.

As Mosseri puts it, “real innovation invariably involves disruption.” Such disruptions, like the News Feed, often involve a short-term dip in metrics, but they are the kinds of activities that produce meaningful long-term results. When it comes to design at Facebook, data informs design; it doesn’t dictate it.

Mosseri highlights one other point about how Facebook has historically done design: “We’ve gotten away with a lot of designing for ourselves.” If that sounds familiar, it’s because it’s the way that another famous technology company designs its products too.

Apple Defines Design

If there is one company that epitomizes great design, it’s Apple. As Steve Jobs once famously said, “We do not do market research.”4 Rather, said Jobs, “We figure out what we want. And I think we’re pretty good at having the right discipline to think through whether a lot of other people are going to want it too.”

When it comes to the world’s most famous design company, a few things stand out, according to Michael Lopp, former senior engineering manager at Apple, and John Gruber.5

First, Apple thinks good design is a present. Apple doesn’t just focus on the design of the product itself, but on the design of the package the product comes in. “The build up of anticipation leading to the opening of the present that Apple offers is an important—if not the most important—aspect of the enjoyment people derive from Apple’s products.” For Apple, each product is a gift within a gift within a gift: from the package itself to the look and feel of the iPad, iPhone, or MacBook, to the software that runs inside.

Next, “pixel-perfect mockups are critical.” Apple designers mock up potential designs down to the very last pixel. This approach removes all ambiguity from what the product will actually look like. Apple designers even use real text rather than the usual Latin “lorem ipsum” text found in so many mockups.

Third, Apple designers typically make 10 designs for any potential new feature. Apple design teams then narrow the 10 designs down to three and then down to one. This is known as the 10:3:1 design approach.

Fourth, Apple design teams have two kinds of meetings each week. Brainstorm meetings allow for free thinking with no constraints on what can be done or built. Production meetings focus on the reality of structure and the schedules required to ship products.

Apple does a few other things that set its design approach apart from others as well. The company doesn’t do market research. Instead, employees focus on building products that they themselves would like to use.6

The company relies on a very small team to design its products. Jonathan Ive, Apple’s Senior Vice President of Industrial Design, relies on a team of no more than 20 people to design most of Apple’s core products.

Apple owns both the hardware and software, making it possible to deliver a fully integrated, best-of-breed experience. What’s more, the company focuses on delivering a very small number of products for a company its size. This allows the company to focus on delivering only best-of-breed products. Finally, the company has “a maniacal focus on perfection,” and Jobs was said to dedicate half his week to the “high- and very low-level development efforts” required for specific products.

Apple is known for the simplicity, elegance, and ease of use of its products. The company focuses on design as much as it does on function. Jobs stated that great design isn’t simply for aesthetic value—it’s about function. The process of making products aesthetically pleasing comes from a fundamental desire to make them easy to use. As Jobs once said, “Design is not just what it looks and feels like. Design is how it works.”

Big Data in Game Design

Another area of technology in which Big Data plays a key role is in the design of games. Analytics allows game designers to evaluate new retention and monetization opportunities and to deliver more satisfying gaming experiences, even within existing games. Game designers can look at metrics like how much it costs to acquire a player, retention rates, daily active users, monthly active users, revenue per paying player, and session times, that is, the amount of time that players remain engaged each time they play.7

Kontagent is one company that provides tools to gather such data. The company has worked with thousands of game studios to help them test and improve the games they create.

Game companies are creating games with completely customizable components. They use a content pipeline approach in which a game engine can import game elements, including graphical elements, levels, objects, and challenges for players to overcome.8

The pipeline approach means that game companies can separate different kinds of work—the work of software engineers from that of graphic artists and level designers, for example. It also makes it far easier to extend existing games by adding more levels, without requiring developers to rewrite an entire game.

Instead, designers and graphic artists can simply create scripts for new levels, add new challenges, and create new graphic and sound elements. It also means that not only can game designers themselves add new levels but players can potentially add new levels, or at least new graphical objects.

Separating out the different components of game design also means that game designers can leverage a worldwide workforce. Graphic artists might be located in one place while software engineers are located in another.

Scott Schumaker of Outrage Games suggests that a data-driven approach to game design can reduce the risks typically associated with game creation. Not only are many games never completed, but many completed games are not financially successful. As Schumaker points out, creating a great game isn’t9 just about designing good graphics and levels; it’s also about making a game fun and appealing.

It’s difficult for game designers to assess these kinds of factors before they implement games, so being able to implement, test, and then tweak a game’s design is critical. Separating game data from the game engine makes it far easier to adjust game play elements, such as the speed of the ghosts in Pac-Man.
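The separation of game data from the game engine can be sketched in a few lines. The snippet below is a minimal illustration only; the tuning values, file format, and class name are invented for this example and are not taken from any real engine:

```python
import json

# Hypothetical tuning file that a designer can edit without
# touching (or recompiling) any engine code.
TUNING_JSON = """
{
    "ghost_speed": 1.8,
    "player_speed": 2.0,
    "lives": 3
}
"""

class GameEngine:
    """Minimal engine that reads gameplay parameters from data, not code."""

    def __init__(self, tuning: dict):
        self.tuning = tuning

    def ghost_position(self, start: float, seconds: float) -> float:
        # Ghost movement depends on a tunable value, so a balance change
        # (say, slowing the ghosts down) is a data edit, not a code edit.
        return start + self.tuning["ghost_speed"] * seconds

tuning = json.loads(TUNING_JSON)
engine = GameEngine(tuning)
print(engine.ghost_position(0.0, 10.0))  # 18.0
```

Because the engine only ever reads these values, a designer, or even a player building custom content, can retune the game by shipping a new data file.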

One company that has taken data-driven game design to a new level is Zynga. Well-known for successful Facebook games like CityVille and Mafia Wars, Zynga evaluates the impact of game changes nearly in real-time. Zynga’s game makers can see how particular features impact how many gifts people send to each other and whether people are spreading a game virally or not.10

By analyzing data, Zynga was able to determine that in FrontierVille it was too hard for new players to complete one of their first tasks, which was building a cabin. By making the task easier, a lot more players ended up sticking around to play the game. Although Zynga’s11 public market value has recently declined, there’s clearly a lot to be learned from its approach to game design.

Better Cars with Big Data

What about outside the tech world? Ford’s Big Data chief John Ginder believes the automotive company is sitting on immense amounts of data that can “benefit consumers, the general public, and Ford itself.”12 As a result of Ford’s financial crisis in the mid-2000s and the arrival of new CEO Alan Mulally in 2006,13 the company has become a lot more open to making decisions based on data, rather than on intuition. The company is considering new approaches based on analytics and simulations.

Ford had analytics groups in its different functional areas, such as for risk analysis in the Ford Credit group, marketing analysis in the marketing group, and fundamental automotive research in the research and development department. Data played a big role in the company’s turnaround, as data and analytics were called upon not just to solve tactical issues within individual groups but to be a critical asset in setting the company’s go-forward strategy. At the same time, Mulally places a heavy emphasis on a culture of being data-driven; that top-down focus on measurement has had a huge impact on the company’s use of data and its turnaround, according to Ginder.

Ford also opened a lab in Silicon Valley to help the company access tech innovation. The company gets data from some four million vehicles that have in-car sensing capabilities. Engineers can analyze data about how people use their cars, the environments they’re14 driving in, and vehicle response.

All of this data has the potential to help the company improve car handling, fuel economy, and vehicle emissions. The company has already used such data to improve car design by reducing interior noise, which was interfering with in-car voice recognition software. Such data also helped Ford engineers determine the optimal position for the microphone used to hear voice commands.15

Big Data also helps car designers create better engines. Mazda used tools from MathWorks to develop its line of SKYACTIV engine technologies. Models allow Mazda engineers to “see more of what’s going on inside the engine,” and achieve better fuel efficiency and engine performance as a result. Such models allow engine designers to test new engine components and designs before creating expensive prototypes.16, 17

Historically, the challenge has been that internal combustion engines, which power most vehicles, have been incredibly hard to model. That’s because they are inherently complex systems. They involve moving fluids, heat transfer, ignition, the formation of pollutants, and in diesel and fuel injection engines, spray dynamics.

Designers are also using data analytics to make decisions about how to improve racecars, decisions that could eventually impact the cars that consumers buy. In one example, the Penske Racing team kept losing races. To figure out why, engineers outfitted the team’s18 racecars with sensors that collected data on more than 20 different variables such as tire temperature and steering. Although the engineers ran analysis on the data for two years, they still couldn’t figure out why drivers were losing races.

Data analytics company Event Horizon took the same data but applied a different approach to understanding it. Instead of looking at the raw numbers, they used animated visualizations to represent changes in the racecars. By using these visualizations, they were quickly able to figure out that there was a lag time between when a driver turned the steering wheel and when a car actually turned. This resulted in drivers making lots of small adjustments, all of which together added up to lost time. It’s not enough just to have the right data—when it comes to design as well as other aspects of Big Data, being able to see the data in the right way matters a lot.

Note Gathering Big Data is the easy part. Interpreting it is the hard part. But if you get it right, you will gain competitive advantages that will vault you to the next level of excellence and company value.

Big Data and Music

Big Data isn’t just helping us build better cars and airplanes. It’s also helping us design better concert halls. W.C. Sabine, a lecturer at Harvard University, founded the field of architectural acoustics around the turn of the 20th century.19, 20

In his initial research, Sabine compared the acoustics of the Fogg Lecture Hall, in which listeners found it difficult to hear, with the nearby Sanders Theater, which was considered to have great acoustics. In conjunction with his assistants, Sabine would move materials such as seat cushions from the Sanders Theater to the Fogg Lecture Hall to determine what impact such materials had on the hall’s acoustics. Remarkably, Sabine and his assistants did this work at night, taking careful measurements, and then replacing all the materials by morning so as not to disturb the daytime use of the two halls.

After much study, Sabine defined the reverberation time or “echo effect,” which is the number of seconds required for a sound to drop from its starting level by 60 decibels. Sabine figured out that the best halls have reverberation times between 2 and 2.25 seconds. Halls that have reverberation times that are too long are considered too21 “live,” while halls that have reverberation times that are too short are considered too “dry.”

The reverberation time is calculated based on two factors: the room volume and the total absorption area, or the amount of absorption surface present. In the case of the Fogg Lecture Hall, where spoken words remained audible for about 5.5 seconds (long enough for a speaker to utter an additional 12 to 15 words), Sabine was able to reduce the echo effect and improve the acoustics of the hall. Sabine later went on to help design Boston’s Symphony Hall.
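Sabine’s relationship between volume and absorption can be expressed directly. The sketch below uses the standard metric form of the Sabine equation, RT60 = 0.161 × V / A; the hall dimensions in the example are made up purely for illustration:

```python
def reverberation_time(volume_m3: float, absorption_m2: float) -> float:
    """Sabine's formula: RT60 = 0.161 * V / A (metric units).

    volume_m3     -- room volume in cubic meters
    absorption_m2 -- total absorption area in square meters (metric sabins)
    """
    return 0.161 * volume_m3 / absorption_m2

# A concert hall with illustrative (invented) dimensions:
rt60 = reverberation_time(volume_m3=15000, absorption_m2=1100)
print(round(rt60, 2))  # 2.2 -- within the 2 to 2.25 second range Sabine favored
```

Note how the two levers interact: adding absorptive material (seat cushions, in Sabine’s experiments) raises A and shortens the reverberation time, while a larger room volume lengthens it.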

Since Sabine’s time, the field has continued to evolve. Now data analysts can use models to evaluate sound issues with existing halls and to simulate the design of new ones. One innovation has been the introduction of halls that have reconfigurable geometry and materials, which can be adjusted to make a hall optimal for different uses, such as music or speech.

Ironically, classic music halls, such as those built in the late 1800s, have remarkably good acoustics, while many halls built more recently do not. In the past, architects were constrained by the strength and stiffness of timber, which historically was used in the construction of such halls. More recently, the desire to accommodate more seats, along with new building materials that let architects design concert halls of nearly any size and shape, has increased the need for data-driven hall design.22

Architects are now trying to design newer halls to sound a lot like the halls of Boston and Vienna. Acoustic quality, hall capacity, and hall shape may not be mutually exclusive. By taking advantage of Big Data, architects may be able to deliver the sound quality of old while using the building materials and accommodating the seating requirements of today.

Big Data and Architecture

Concert halls aren’t the only area of architecture in which designers are employing Big Data. Architects are applying data-driven design to architecture more generally. As Sam Miller of LMN, a 100-person architecture firm, points out, the old architectural design model was design, document, build, and repeat. It took years to learn lessons, and an architect with 20 years of experience might have seen only a dozen such design cycles.

With data-driven approaches to architecture, architects have replaced this process with an iterative loop: model, simulate, analyze, synthesize, optimize, and repeat. Much as engine designers can use models to simulate engine performance, architects can now use models to simulate the physical act of building.23

According to Miller, his group can now run simulations on hundreds of designs in a matter of days and they can figure out which factors have the biggest impact. “Intuition,” says Miller, “plays a smaller role in the data-driven design process than it did in the analog process.” What’s more, the resulting buildings perform measurably better.

Architects don’t bill for research and design hours, but Miller says that the use of a data-driven approach has made such time investments worthwhile because it gives his firm a competitive advantage.

Big Data is also helping design greener buildings by putting to work data collected about energy and water savings. Architects, designers, and building managers can now evaluate benchmark data to determine how a particular building compares to other green buildings. The EPA’s24 Portfolio Manager is one software tool that is enabling this approach. The Portfolio Manager is an interactive energy management tool that allows owners, managers, and investors to track and assess energy and water usage across all of the buildings in a portfolio.25

Sefaira offers web-based software that leverages deep physics expertise to provide design analysis, knowledge management, and decision support capabilities. With the company’s26 software, users can measure and optimize the energy, water, carbon, and financial benefits of different design strategies.

Data-Driven Design

In studying the design approaches of many different companies and the ways in which data is used, what’s clear is that data is being used more and more to inform design. But it is also clear that design, and the disruption that comes from making big changes, still relies on intuition, whether at Apple, Facebook, or at your own company.

As Brent Dykes—evangelist of customer analytics at the web analytics company Adobe/Omniture and author of the blog AnalyticsHero—notes, creative work and data are often seen as being at odds with each other. Designers frequently view data as a barrier to creativity, rather than as an enabler of better design.27

In one famous instance, Douglas Bowman, a designer at Google, left the company, citing its oppressive, data-driven approach to design. Bowman described an instance in which a team at Google couldn’t decide between two shades of blue, so they tested 41 shades between each blue to determine which one performed the best.

Yet Bowman, now Creative Director at Twitter, didn’t fault Google for its approach to design, which he describes as reducing “each decision to a simple logic problem,”28 given the billions of dollars of value at stake. But he did describe data as “a crutch for every decision, paralyzing the company and preventing it from making any design decisions.”

In contrast, Dykes believes that the restrictions that data introduces actually increase creativity. Data can be incredibly helpful in determining whether a design change helps more people accomplish their tasks or results in higher conversions from trial to paid customers on a web site.

Data can help improve an existing design, but it can’t, as Facebook designer Adam Mosseri points out, present designers with a completely new design. It can improve a web site, but it can’t yet create a whole new site from scratch if that’s what is required. Put another way, when it comes to design, data may be more helpful in reaching a local maximum than a global one.

Data can also tell you when a design simply isn’t working. As serial ­entrepreneur and Stanford University lecturer Steve Blank once said to an entrepreneur who was getting his advice, “look at the data.” Blank was highlighting that the entrepreneur’s thesis simply wasn’t bearing out.

What’s also clear across many different areas of design, from games to cars to buildings, is that the process of design itself is changing. The cycle of creating a design and testing it is becoming a lot shorter due to the use of Big Data resources.

The process of getting data on an existing design and figuring out what’s wrong or how to incrementally improve it is also happening much faster, both online and offline. Low-cost data collection and computing resources are playing a big role in making the process of design, testing, and redesign a lot faster. That, in turn, is enabling people to have not only their designs but the design processes themselves informed by data.

Big Data in Better Web Site Design

As compelling as all that is, however, many of us will never get to design a smartphone, a car, or a building. But creating a web site is something nearly anyone can do. And millions of people do.

As of September 2014, there were more than one billion web sites online.29 Web sites aren’t just the purview of big companies. Small business owners and individuals have built more than 50 million web sites using a free web site design tool called Wix.30

While web analytics—the tools used to track site visits, test whether one version of a web page works better than another, and measure conversions of visitors to customers—has come a long way in the last decade, web design itself has not become much more data-driven over the same period, says Dykes.

Ironically, the web is one of the easiest forms of design to measure. Every page, button, and graphic can be instrumented. Designers and marketers can evaluate not just on-site design but the impact of advertising, other sites a user has visited, and numerous other off-site factors.

There are lots of web analytics tools available, but many of those tools require heavy analytics expertise or are primarily targeted at technologists, not marketers. Solutions like Adobe Test&Target (formerly Omniture Test&Target) and Google Analytics Content Experiments provide the ability to test different designs, but still require technical expertise. More recently introduced offerings, such as Optimizely, promise to make creating and running site optimization tests—known as A/B tests for the way in which they evaluate two variants of the same design—a lot easier.
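Under the hood, an A/B test asks whether the difference in conversion rate between two variants is larger than chance alone would explain. The sketch below shows one common way to answer that, a two-proportion z-test; it is a simplified illustration with invented traffic numbers, not how Optimizely or any other particular tool actually works:

```python
import math

def ab_test_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test comparing conversions of variants A and B.

    Returns (z, two_sided_p). Commercial tools typically layer sequential
    testing and multiple-comparison corrections on top of basics like this.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented example: 120/2400 (5.0%) vs. 168/2400 (7.0%) conversions.
z, p = ab_test_z(120, 2400, 168, 2400)
print(round(z, 2), round(p, 4))  # z near 2.9, p well below 0.05
```

With a p-value below the conventional 0.05 threshold, a marketer would conclude that variant B’s higher conversion rate is unlikely to be noise and roll it out more broadly.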

What’s more, at large companies, making changes to a company’s web site remains a time-consuming and difficult process, one that involves working with relatively inflexible content management systems and quite often, IT departments that already have too much work to do. Thus, while experimenting with new designs, graphics, layouts, and the like should be easy to do, it’s still quite difficult. A web site overhaul is usually a major project, not a simple change.

Many content management systems rely on a single template for an entire site and while they make it easy to add, edit, or remove individual pages, create a blog post, or add a whitepaper, changing the actual site design is hard. Typically there’s no integrated, easy way to tweak a layout or test different design approaches. Such changes usually require the involvement of a web developer.

These kinds of systems frequently lack built-in staging capabilities. A major change is either live or it isn’t. And unlike platforms like Facebook, which has the ability to roll out changes to a subset of users and get feedback on what’s working and what’s not, most content management systems have no such capability. Facebook can roll out changes to a subset of users on a very finely targeted basis—to a specific percentage of its user base, males or females only, people with or without certain affiliations, members of certain groups, or based on other characteristics.
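One common way to implement this kind of targeted percentage rollout is to hash each user ID into a stable bucket. The sketch below illustrates that general technique only; it is a deliberately simplified assumption, not Facebook’s actual system, and the feature name is invented:

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    """Deterministically decide whether a user sees a feature.

    Hashing user_id + feature maps each user to a stable bucket in
    [0, 100), so the same user always sees the same variant, and
    raising `percent` only ever adds users to the rollout.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000 / 100.0   # 0.00 .. 99.99
    return bucket < percent

# Roll a hypothetical redesigned feed out to 5% of 10,000 users:
enabled = [f"user{i}" for i in range(10000)
           if in_rollout(f"user{i}", "new_feed", 5.0)]
print(len(enabled))  # roughly 500 users
```

Because bucketing is deterministic, the same audience can be measured before and after a change, and extra dimensions (gender, group membership, geography) can be layered on as additional filters ahead of the hash check.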

This makes changes and new features relatively low risk. In contrast, major changes to most corporate web sites are relatively high risk due to the inability to roll out changes to a subset of a site’s visitors. What’s more, most content management systems use pages and posts as the fundamental building blocks of design. The other design elements are typically static. In contrast, a site like Facebook may have many modules that make up a particular page or the site as a whole, making any individual module easier to change.

Moreover, marketing executives are often concerned about losing inbound links that reference a particular piece of content. As a result, marketing executives and IT managers alike are loath to make and test such changes on a frequent basis.

The good news is that with the advent of Big Data tools for design and analysis, there are lots of resources available to help marketers understand how to design their sites for better conversion. The lack of data-driven design when it comes to web sites may have more to do with the limitations of yesterday’s underlying content management systems and the restrictions they impose than with the point tools available to inform marketers about what they need to fix. As data-driven optimization tools continue to show users the power of being able to tweak, analyze, and tune their sites, more users will demand better capabilities built into their content management systems. That demand will push such systems to become more data-driven, or it will drive the emergence of new systems built with data-driven design in mind from the outset.

Web site designers and marketers recognize that they need to change. Static web sites no longer garner as much traffic in Google. Fresh, current web sites that act a lot more like news sites get indexed better in Google and are more likely to have content shared on social media such as Twitter, Facebook, LinkedIn, blogs, and news sites.

Newer sites, and modern marketing departments, are a lot more focused on creating and delivering content that appeals to actual prospects, rather than just to Google’s web site crawlers. In large part, this is due to Google’s recent algorithm changes, which have placed more emphasis on useful, current, and authoritative content.

Going forward, it will no longer be enough to offer customers a mediocre landing page. Web sites and mobile applications will need to be visually appealing, informative, and specifically designed to draw prospects in and convert them into customers—and then retain them for the long term.

As social media and search change to become more focused on content that’s interesting and relevant to human beings, marketers and web site designers will need to change too. They’ll need to place more emphasis on data-driven design to create web sites and mobile applications, as well as content that appeals to human beings, not machines. Both on and off the web, we will evolve to create designs and implement design approaches informed by data. Of course, the design tools we use will evolve as well.

In the years ahead, data will continue to become a more integral part of product design, for both digital and physical products. A/B testing will become mainstream as tools for testing different design variants continue to become more automated and easier to use. Expect products to be built in more modular ways so that newer components can easily replace older ones as data shows what works and what doesn’t. Continued reduction in the cost of new 3D printing technologies will also make it easier to develop customized products based on what the data indicates.

____________________

1http://www.time.com/time/nation/article/0,8599,1532225,00.html

2http://uxweek.com/2010/videos/video-2010-adam-mosseri

3http://www.ojr.org/ojr/stories/070312ruel/

4http://money.cnn.com/galleries/2008/fortune/0803/gallery.jobsqna.fortune/3.html

5http://www.pragmaticmarketing.com//resources/you-cant-innovate-like-apple?p=1

6This myth has been dispelled, at least to some extent, by documents made public as a result of the Apple-Samsung court case, citing a recent market research study the company conducted; see http://blogs.wsj.com/digits/2012/07/26/turns-out-apple-conducts-market-research-after-all/.

7http://kaleidoscope.kontagent.com/2012/04/26/jogonuat-ceo-on-using-data-driven-game-design-to-acquire-high-value-players/

8http://www.cis.cornell.edu/courses/cis3000/2011sp/lectures/12-DataDriven.pdf

9http://ai.eecs.umich.edu/soar/Classes/494/talks/Schumaker.pdf

10http://www.gamesindustry.biz/articles/2012-08-06-zyngas-high-speed-data-driven-design-vs-console-development

11http://www.1up.com/news/defense-zynga-metrics-driven-game-design

12http://www.zdnet.com/fords-big-data-chief-sees-massive-possibilities-but-the-tools-need-work-7000000322/

13Mulally retired as CEO in June 2014.

14http://blogs.wsj.com/cio/2012/06/20/ford-gets-smarter-about-marketing-and-design/

15http://blogs.wsj.com/cio/2012/04/25/ratings-upgrade-vindicates-fords-focus-on-tech/

16http://www.sae.org/mags/sve/11523/

17http://www.ornl.gov/info/ornlreview/v30n3-4/engine.htm

18http://blogs.cio.com/business-intelligence/16657/novel-encounter-big-data

19http://lib.tkk.fi/Dipl/2011/urn100513.pdf

20http://en.wikipedia.org/wiki/Wallace_Clement_Sabine

21http://www.aps.org/publications/apsnews/201101/physicshistory.cfm

22http://www.angelfire.com/music2/davidbundler/acoustics.html

23http://www.metropolismag.com/pov/20120410/advancing-a-data-driven-approach-to-architecture

24http://www.greenbiz.com/blog/2012/05/29/data-driven-results-qa-usgbcs-rick-fedrizzi

25http://www.energystar.gov/index.cfm?c=evaluate_performance.bus_portfoliomanager

26http://venturebeat.com/2012/04/10/data-driven-green-building-design-nets-sefaira-10-8-million/

27http://www.analyticshero.com/2012/12/04/data-driven-design-dare-to-wield-the-sword-of-data-part-i/

28http://stopdesign.com/archive/2009/03/20/goodbye-google.html

29http://www.internetlivestats.com/total-number-of-websites/

30http://www.wix.com/blog/2014/06/flaunting-50m-users-with-50-kick-ass-wix-sites/