It seems like it wasn’t too long ago that an average 3-year-old could easily beat a computer at determining someone’s gender by looking at a picture. Now, in-store cameras are getting better at determining both the gender and the age of shoppers in the store.
This advance really allows in-store retailers to catch up with their on-line counterparts in terms of understanding customers: where do they spend their time, how long do they stay, what do they look at, and what do they ultimately buy?
The article provides an example of a retailer who determined when the number of shoppers peaked (it wasn’t when sales peaked) and built their staff schedules around that to generate more sales.
There is a lot more retailers can do with this information. It will be interesting to see how this evolves over the next few years.
DC Water supplies water to 2 million customers in the Washington, DC area through several thousand miles of pipes (with an average age of over 75 years) and maintains nearly 9,000 fire hydrants. Several years ago, everything was tracked on paper.
IBM and DC Water have recently published several videos and articles on how DC Water has applied descriptive, predictive, and prescriptive analytics to the system with positive results.
This is a nice case study in the use of analytics.
The descriptive analytics allowed DC Water to map the location of its fire hydrants for better maintenance. With extra sensors, it also allowed them to better monitor water usage and look for anomalies.
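The articles don’t describe how the anomaly detection works, but a very simple version of the idea is to flag readings that sit far outside a sensor’s normal range. Here is a minimal sketch (the readings and the 2-standard-deviation threshold are my own hypothetical illustration):

```python
import statistics

# Hypothetical hourly water-usage readings (gallons) from one sensor.
readings = [102, 98, 101, 99, 103, 100, 97, 250, 101, 99]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)

# Flag any reading more than 2 standard deviations from the mean --
# a possible leak, a break, or a bad sensor worth investigating.
anomalies = [(i, r) for i, r in enumerate(readings)
             if abs(r - mean) > 2 * stdev]
print(anomalies)  # the spike of 250 at index 7 stands out
```

A production system would use something more robust (seasonal baselines, per-meter models), but the core task is the same: define normal usage, then surface readings that deviate from it.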
The predictive analytics allowed them to predict failures in the aging pipes before they happened.
And the prescriptive analytics allowed DC Water to better route maintenance crews to fix trouble tickets, increasing the productivity of the maintenance team while driving down fuel costs.
IBM published several videos on the solution:
This first video is a nice high-level overview of the solution:
The second video provides more details of the solution:
The third video highlights the predictive analytics solution:
IBM also published a written document of the solution, but I found the videos very well done.
The Economist’s Jan 19th survey on Offshoring and Outsourcing made an interesting point about the rising value of data to an organization. The article mentioned that the outsourcing movement grew by moving non-essential, repetitive jobs to third-party providers (mostly in India):
Some activities that used to be considered peripheral to a company’s profits, such as data management, are now seen as essential, so they are less likely to be entrusted to a third-party supplier thousands of miles away.
and this quote from GM’s CIO:
“IT has become more pervasive in our business and we now consider it a big source of competitive advantage,”
I have seen some articles argue that data is starting to be viewed as a critical economic input, just like capital and labor. These quotes tend to highlight that. If data is important to you, you do not want to simply outsource that to the lowest bidder.
A recent BusinessWeek article discussed GE’s entry into the analytics space. GE is starting to put sensors on its industrial machines and is building the capability to analyze that data.
As an example, the article’s opening paragraph mentions that a jet engine can collect a terabyte of data on one cross-country flight:
On Nov. 29, Jeff Immelt pulled out the really big iron. General Electric’s (GE) chief executive climbed up to take the stage at a modified film studio in San Francisco and stood next to a 6.87-ton jet engine built by his company. Inside this mass of twisted metal—Immelt told the spectators at the company’s Minds and Machines event—were 20 sensors that monitor the engine’s performance, generating part of the roughly 1 terabyte of information produced on a one-way, cross-country flight. In the years ahead, GE plans to analyze this information as it’s never been analyzed before in a quest to build smarter machines and more lucrative services that it can sell to customers.
The prize? With a 1% improvement, GE claims it can save its customers billions of dollars over a 15-year period.
This definitely fits the trend of firms putting sensors on various types of equipment to help drive improvements.
Northwestern’s Fall 2012 magazine features various researchers who are working in the area of Big Data. The need to analyze Big Data is a reason that NU started the Masters in Analytics program:
“I’m getting calls from firms that see the value in big data, but they don’t know how to extract it,” says analytics expert Diego Klabjan, professor of industrial engineering and management sciences. “It’s definitely a very, very hot area. Everyone’s looking for expertise. We’ve had tremendous interest from companies. These days every company needs analytics. They need to hire a workforce that is capable of analyzing data.”
To that end, McCormick recently developed a master of science program in analytics. The inaugural class of the 15-month program is learning data warehousing techniques, the science behind analytics, and the business aspects of analytics. Directed by Klabjan, the program has its own computing cluster to take on big-data problems, and students will each do a summer internship. They will learn to identify patterns and trends, interpret and gain insights from vast quantities of structured and unstructured data, and communicate their findings in business terms.
It is easy to find articles that mention analytics. It is harder to find one that gives some concrete examples. A recent article on Fab.com (a fast-rising $140 million design retailer) does a nice job, in one small paragraph, of describing what they are doing with analytics:
WSJ: How are you using data to figure out what customers want?
Mr. Goldberg: We do a ton of customer segmentation, a ton of cohort analysis. We’re starting to get smart about putting certain products in front of people based on what they’ve looked at in the past or bought in the past. We’re already one of the leaders in utilizing data for e-commerce when it comes to understanding a wide range of products and how to merchandise them. You’ll see much more personalization and customization come out next year.
(For a good definition of cohort analysis, see here or here)
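To make the idea concrete, here is a minimal sketch of what a cohort retention calculation computes, in plain Python. The order records, customer IDs, and months are entirely hypothetical; a real analysis would run on far larger data:

```python
from collections import defaultdict

# Hypothetical order records: (customer_id, order month as "YYYY-MM").
orders = [
    ("c1", "2012-01"), ("c1", "2012-02"), ("c1", "2012-03"),
    ("c2", "2012-01"), ("c2", "2012-03"),
    ("c3", "2012-02"), ("c3", "2012-03"),
    ("c4", "2012-02"),
]

# A customer's cohort is the month of their first order.
first_month = {}
for cust, month in sorted(orders, key=lambda o: o[1]):
    first_month.setdefault(cust, month)

# For each (cohort, month) pair, collect the customers who were active.
activity = defaultdict(set)
for cust, month in orders:
    activity[(first_month[cust], month)].add(cust)

# Cohort sizes, for computing retention percentages.
cohort_size = defaultdict(set)
for cust, month in first_month.items():
    cohort_size[month].add(cust)

for (cohort, month), custs in sorted(activity.items()):
    retention = len(custs) / len(cohort_size[cohort])
    print(f"cohort {cohort}, active in {month}: {retention:.0%}")
```

Tracking each signup cohort’s activity month by month is what lets a retailer like Fab.com see whether newer customers behave differently from older ones, instead of looking at one blended average.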
The print version of the Dec 2012 National Geographic magazine had a short article on San Francisco’s pilot program to dynamically change street parking prices based on demand. (You don’t really expect to see an article on dynamic pricing and revenue management in National Geographic!)
The Wall Street Journal ran a front-page story on how on-line retailers are changing prices in real time to match competitors. Since this is happening in real time, the retailers need good systems for keeping track of competitors’ prices, and good algorithms for deciding what to do.
Since this market moves a lot faster than the airline market, I’m guessing that these algorithms need a bit more game theory in their strategies than the airlines’ do (what will competitor X do in real time if I change this price?).
To combat the price war, retailers are trying to offer unique items. Of course, this means that the algorithms will have to figure out when two products are close enough to be substitutes.
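To give a flavor of that substitute-matching problem, here is a toy sketch, entirely my own illustration rather than any retailer’s actual method. It matches a competitor’s price only when two product titles look similar enough to be the same item:

```python
def tokens(title: str) -> set:
    """Split a product title into a set of lowercase words."""
    return set(title.lower().split())

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two titles' word sets (0.0 to 1.0)."""
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb)

def reprice(our_price, competitor_price, our_title, competitor_title,
            match_threshold=0.6):
    """Match the competitor's price only if the listings look like substitutes."""
    if similarity(our_title, competitor_title) >= match_threshold:
        return min(our_price, competitor_price)
    return our_price

# Similar titles: match the lower price.
print(reprice(549.99, 499.00,
              "canon eos rebel t3 dslr camera",
              "canon eos rebel t3 digital slr camera"))  # 499.0

# Unrelated product: leave our price alone.
print(reprice(549.99, 499.00,
              "canon eos rebel t3 dslr camera",
              "blue ceramic vase"))  # 549.99
```

Real systems match on structured identifiers (UPCs, model numbers) and far richer similarity signals than word overlap, which is exactly why retailers selling unique or private-label items are harder to reprice against.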
I’m guessing that we’ll only see more articles like this as dynamic pricing moves to more product categories.
I didn’t naturally think about big data being applied to the dairy industry. But this isn’t the first place I’ve seen the industry referenced. Cows are expensive and produce a lot of milk, so keeping the herd healthy and productive is important. With new sensors, new tests, and the ability for dairy farmers to upload data for analysis, the industry is a good target for big data efforts.