Monday, 23 December 2013

How data analytics make Red Sox fans happier


For any baseball team, making the World Series is a pretty dependable way to boost revenue. But when the Boston Red Sox seemed a long fly ball from this goal, the team still had something else going for it other than slugger David Ortiz: big data analytics. Ryan Scafidi, manager of financial planning and operations for the Red Sox, says, "Not only are we driving bigger profits, we are better able to monitor, manage and reduce operating costs." The Sox have implemented several cloud-based technologies from providers such as Host Analytics, MicroStrategy and Microsoft Dynamics to slice and dice data on ticket sales, as well as revenue from merchandise, food and parking. The tools also help the team reduce expenses by pinpointing how many ticket takers, ushers and security personnel it will need for a particular game or game series, based on past experience. The World Series champs access voluminous structured and unstructured game data on the weather, the opposing team, the day of the week, the game time and various pre-game promotions. Algorithms then let the team forecast how best to allocate resources based on expected fluctuations in demand.

When data points to "Dollar Beard Night"

One unexpected benefit of using cloud-based technology: marketing can go to work building innovative promotions designed to smooth out the ups and downs in demand or drive ancillary sales from concessions, and the software allows the team to measure these cross-channel effects. One example from last year that got fans through the ticket gates was Dollar Beard Night. Anyone who showed up at Fenway Park with a beard, real or fake, got into that night's game against the Baltimore Orioles for a dollar. "We had a chunk of tickets that hadn't been bought, so we decided to offer them at a buck apiece and then measure the revenue this generated for the concessions," Scafidi says. "We had another promotion, Kids Eat Free, where we measured ticket sales revenue.
We then make this data comparable to other data we've amassed and crunch it into insight."

Recovering from a frustrating season

In this regard, the Red Sox are as sharp in the back office as they are on the field. Nathaniel Rowe, research analyst, enterprise data management, at Aberdeen Group, says there is significant economic value in what the club is doing, data-wise. "Top-performing companies are leveraging data to embrace new ideas, seize opportunities quickly and solve pain points," Rowe explains. "Data is the new oil, and that reflects how valuable information is to a business in this day and age." By divining ways to better understand customer behaviors, preferences and sentiments, organizations are able to glean what customers like and dislike, he adds. "When customers are happy, they come back," Rowe says.

They certainly came back to Fenway Park last season, although early on there were plenty of empty seats. The prior season, the Red Sox were mired in a mess chalked up to former manager Bobby Valentine, who generated more than his share of fan frustration over the team's poor performance and some questionable decisions. "The bad publicity didn't help us as the season began," Scafidi acknowledges. "Our renewal season tickets were down, making it all the more important that we leveraged the data metrics to drive profitable business growth."

Creating digital fan profiles

That's all in the past now, he says. "We're already capturing and analyzing financial, marketing and demographic data from this past season to make projections on what 2014 will look like," he says.
"Obviously, after winning the Series, we're taking orders by the minute, but we're still 'pounding the streets' with the knowledge of who our target audience is to offer the appropriate packages to them." Among the insights gleaned from last season's data is a digital profile of fans: who prefers weekend games versus weekday games, who prefers day games versus evening games, how people get to the ballpark when specific teams are visiting, and which games families are most likely to attend. As for the return on the investment in the technology tools, Scafidi says, "The benefits far exceed the investment. Not only are we driving bigger profits, we are better able to monitor, manage and reduce operating costs." While he won't divulge how much money the team has saved or earned as a result, the Sox are confident about where they're headed. In other words, don't shave the beard.
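The forecasting the team describes, predicting demand from factors like opponent, day and promotion and then sizing the game-day crew, can be sketched in miniature. This is a hypothetical illustration, not the Red Sox's actual model: the historical data, factor weighting and one-usher-per-400-fans ratio are all invented for the example.

```python
# Hypothetical sketch: forecast attendance by weighting past games that
# match the upcoming game's factors, then size staffing from the forecast.
from statistics import mean

# Invented historical records: (opponent, day_of_week, promotion, attendance)
history = [
    ("Orioles", "Fri", "Dollar Beard Night", 36500),
    ("Orioles", "Fri", None, 31200),
    ("Yankees", "Sat", None, 37400),
    ("Yankees", "Fri", None, 36900),
    ("Orioles", "Tue", None, 28800),
]

def forecast_attendance(opponent, day, promo):
    """Weighted average of past attendance; closer matches count more."""
    weighted = []
    for opp, d, p, att in history:
        weight = 1 + (opp == opponent) + (d == day) + (p == promo)
        weighted.extend([att] * weight)  # weight by repetition
    return mean(weighted)

def staff_needed(expected_fans, fans_per_usher=400):
    """Invented ratio: one usher/ticket taker per 400 expected fans."""
    return int(-(-expected_fans // fans_per_usher))  # ceiling division

fans = forecast_attendance("Orioles", "Fri", "Dollar Beard Night")
print(f"expected fans: {fans:.0f}, staff needed: {staff_needed(fans)}")
```

A real model would fold in weather, time of day and far more history, but the shape is the same: score demand from past games, then translate the forecast into a staffing plan.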

An unsung hero of Caterham F1 team


The Caterham F1 team's performance analysts look at the data coming from the car at the Circuit of the Americas in Austin, Texas. Credit: Dell

If you had to make a list of all the environments where you'd rather not use your laptop, it's a fair bet it would look something like this:

• Really hot places
• Really cold places
• Very dusty or sandy locations
• Anywhere near excessive vibration

The problem, if you do any sort of engineering job for the Caterham F1 team at a grand prix, is that this list pretty much covers every single location of the 19 races the Formula 1™ calendar visits throughout the year. Walk through Caterham's Leafield factory in the United Kingdom or around the team's garage at a Grand Prix, and you will see a familiar piece of equipment. Mostly black, fairly sturdy in size and usually being inspected by a very clever engineer, it is the Dell Precision mobile workstation. It is not an exaggeration to say that this piece of equipment is one of the unsung heroes of this particular Formula 1™ team.

Graphics and design

The Precision's ability to deal with large CAD files while offering state-of-the-art graphical performance makes it an ideal candidate for the design stages of the car. "Normally I'll use mine for looking at lightweight CAD files of the whole car," says Lewis Butler, the team's chief engineer. "It's handy because if there are any issues with how parts fit together on any new updates we've sent to the track, for example, I can be at the factory and speaking to someone at the circuit, quickly bring up the part in question on the Precision and look at it in quite a bit of detail in order to help them out."

Performance

At the factory, the Precision enables other critical work to be performed once the car has been designed – notably in aerodynamics, the most important area of Formula 1 for teams to understand.
"I'm looking at all sorts of things each day, from CAD geometry to CFD post-processing, as well as track data and lots of wind tunnel results," senior aerodynamicist Dominic Turner explains. "The Precision can do lots of different tasks at the same time, from the very heavy CAD graphics to all the data analysis." For aerodynamicists like Turner, who spend time at the wind tunnel 20 miles away from the factory, having a computer with such power that's also portable is invaluable. The Caterham F1 team's performance analysts, who look at the data coming off the car at the track, also need to be able to deal with huge files. "We're processing lots of data that gets sent to the factory from the track during a race weekend, as well as doing quite a lot of code development work," explains junior performance analyst Alex Holyoake. "I've usually got between 10 and 20 applications running at any one time, and a lot of those are pretty data-intensive, so the Precision is amazing for what we have to do."

Durability

For all its handiness at the factory, it's trackside where the Precision really comes into its own. Formula 1™ circuits are harsh environments with extreme temperatures and conditions. Anyone who's ever been near an F1 car when it's fired up will attest to the ferocity of the shuddering and the decibels coming from it. The Precision's hard drives contain no moving parts, so the vibrations coming from a Formula 1™ car are not a problem. Speak to anyone who works with a Precision on the Caterham F1 team and he or she will marvel at the speed with which it operates. Antony Smith, the team's senior IT engineer, reveals, "The thing for the engineers is that they've got to be able to manipulate data and jump around on lots of files at once, and the Precision is perfect for that. They've got lots of memory and bandwidth and they've got good graphical performance. That's what you need to operate the amount of software we've got at Caterham."

Schools seek IT solutions for digital content


Teachers also want the ability to share digital content with colleagues. Credit: Wave Break Media

The increasing use of digital content in the classroom is leading many school district IT managers to seek solutions that allow greater computing freedom without the bulk and cost sometimes associated with such investments. This search is increasingly ending with a purchase of a Dell PowerEdge VRTX, which integrates storage, servers and networking in a compact chassis small enough to fit under a desk. But while VRTX might be relatively small, its physical size shouldn't at all suggest a lack of power. It includes a four-node cluster that can be deployed in under an hour, a unified system-management console to ease administration and the Intel Xeon E5-2600 v2 processor, which offers enough efficiency and flexibility to enable high-performance computing. As more school systems move toward virtualization, Xeon-based servers contribute to the ability to consolidate virtualized networking applications. This allows IT to deliver increased throughput and reduced latency for virtualized workloads.

Capacity building

Horry County Schools in South Carolina, a Dell server customer for 20 years, was preparing to conduct a server refresh cycle in about 35 of its 50 schools when district officials decided to invest in virtualization, which had been on their radar for some time. VRTX was attractive because "for not a huge investment we could get capacity of up to four servers and use VMware to build virtual servers on top of the physical," which was a good way to deploy multiple servers out to the schools, said Thom Mountain, the district's administrator of network and connectivity. "Any time you get into a blade system, there's typically a larger upfront cost because you're purchasing the system as well as the individual modules," he said. "In this case, it was a real cost-effective solution." Each school had three physical servers that ran virtual machines.
Deploying VRTX, which holds up to four blades, allowed Mountain to do some serious consolidation, saving on power consumption, cooling and space.

New kinds of bookshelves

The storage capacity in VRTX lets schools store more information locally, which is becoming more important as demand for digital content in the classroom grows. Horry County teachers rely on VRTX to share content with other teachers, while security personnel use other VRTX servers to store video surveillance files. Like Horry County Schools, the Mohawk Regional Information Center of the Madison-Oneida Board of Cooperative Educational Services in New York was looking to virtualize its servers, as its schools were running out of room for physical servers. "With one VRTX enclosure, a district can host dozens of virtual machines in a very small footprint," said Joe DiFabio, a telecommunications specialist at Mohawk Regional. "Dell PowerEdge VRTX gives our school districts the ability to fail over between compute nodes in the same enclosure, which can really make a difference by ensuring uptime for key applications, including email, file services, student management systems, transportation systems and cafeteria systems."

Mobile polling tactics not yet replacing the phone


Polling in the digital age is experiencing the biggest methodological overhaul in 50 years. Although landline telephone surveys remain a vibrant part of the business of polling voters and consumers, an increasing number of pollsters and market researchers are relying on mobile, social media and other platforms to gauge public opinion. "Telephone surveys will be around for a number of years, but the opportunity afforded by digital technology is too good to go unused," said Scott Keeter, director of survey research at the Pew Research Center in Washington, D.C.

This sea change has led to an uptick in the number and frequency of polls. Adopters of these methods believe they are able to draw a clearer picture of public preferences in an election or on issues. Pollsters believe this trend is only likely to increase as practitioners try to keep pace with skyrocketing Internet usage. A recent Pew Internet and American Life study found that 85 percent of Americans now use the Web. Meanwhile, the amount of market research money spent on Internet surveys has been steadily increasing. Yet online polling has also raised questions about accuracy. That stems both from the ease with which people can conduct polls online and from a rush to release information before competitors can. "There's a concern about the quality of polls as we enter this new phase and more people are entering the field," said Doug Usher, managing partner at Purple Insights, the research division of communications organization Purple Strategies.

Pluses and minuses

Still, many market researchers and pollsters are continuing to evaluate different online resources. They are finding that each carries benefits and disadvantages.
Polls that rely on phone users to text their responses are in favor because it's easy and fast for a respondent to participate, but the samples are limited because federal law prohibits predictive dialing – the automated dialing of phone numbers – to cell phones, so each mobile respondent must be reached by hand. (Some pollsters outside the United States are using texting more extensively, Keeter says.) Twitter can create stronger engagement with small populations. Businesses rely on Twitter to conduct sentiment analyses of their products and services, but most experts in public polling do not yet consider it an adequate substitute for polling the general public. There is also industry interest in the convergence of online surveys and mobile devices (smartphones and tablets), especially when respondents are queried on issues such as health and their media use. Different groups are experimenting with how to format questionnaires for small screens, such as placing one question on each page. Some researchers are also re-examining when and how they ask questions. A few are using open-ended and interactive queries that give respondents some flexibility in answering and generate an online dialogue. Others are conducting their polls outside the weekday evenings and weekends when polls have traditionally taken place. All these tactics are designed to gather more meaningful information than has traditionally been possible.

The telephone remains relevant

While many pollsters increasingly believe that online methods let them connect with groups and pinpoint sensitive information that was previously hard to reach, they continue to rely on telephone polling. The Pew Research Center's Keeter said that his organization and others use a "mixed-mode approach" to capture the multiple ways people interact and communicate, online and offline.
That approach covers mobile devices and tablets, text messaging and tweeting, along with random calling and Interactive Voice Response (IVR) systems, which allow survey participants to respond to computer-automated questions. In a recent survey of the gay community, Pew recruited respondents by phone and mail to participate in an online survey, and it also gave computers and an Internet connection to individuals who lacked online access. It was expensive, Keeter said, but "for some kinds of content, online is better. People answer more honestly and in more detail." Purple Strategies, based in Alexandria, Va., uses a mixture of automated landline telephone interviews and online interviews of likely voters who primarily use cell phones. In a recent poll about Virginia gubernatorial candidates, Purple Strategies surveyed about two-thirds of respondents by landline phone and one-third online. Connecting with someone via landline costs about half as much as calling a cell phone, because federal law does not allow organizations to robocall cell phones; pollsters must dial cell phone numbers manually. Usher says that combining resources enables pollsters to cover a wider amount of ground than might otherwise be possible. Such an eclectic approach can also help compensate for the shortfalls of any single method. "There's no perfect method these days," he said. "New technology is transforming the landscape and the answer is to use multiple forms of polling." Usher added that he was unsure which blend of resources would ultimately prove the most user-friendly and accurate. "But I do know the industry is doing the best it can to challenge itself to find successful ways," he said.
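The sentiment analyses businesses run on Twitter can, in their simplest form, be reduced to tallying positive and negative words in each message. The sketch below is a toy illustration of that idea only: the word lists and sample messages are invented, and production systems use far richer language models.

```python
# Toy word-list sentiment tally, a minimal stand-in for the sentiment
# analyses businesses run on social media posts. Word lists are invented.
POSITIVE = {"love", "great", "fast", "happy", "best"}
NEGATIVE = {"hate", "slow", "broken", "worst", "bad"}

def score(message):
    """+1 per positive word, -1 per negative word; sign gives sentiment."""
    words = message.lower().split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

tweets = [
    "love the new app, so fast",
    "worst update ever, search is broken",
    "it works",
]
for t in tweets:
    label = "pos" if score(t) > 0 else "neg" if score(t) < 0 else "neutral"
    print(label, "|", t)
```

The limitation the article's experts point to shows up immediately in even this toy: the sample of people who tweet about a product is self-selected, so a clean sentiment score still says little about the general public.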

Strategies for application modernization success


It's time to leave your legacy. We all know running legacy IT systems can be costly, inefficient and, at times, frustrating. Right now, 70 percent of IT resources on the federal government side are dedicated to legacy management. We see similar numbers in the healthcare industry and private enterprise. Moreover, legacy systems don't easily support the new technology we need, like cloud computing and mobile apps. This is why application modernization is front and center. Legacy modernization plans are a key initiative for organizations of all sizes and budgets, with good reason. We're living in a new state of computing, and companies can wait no longer. But these projects aren't easy; they are never without roadblocks and technical challenges. So where do we start? Upfront planning and investment rationalization can pave the way for a smoother project and a successful result.

What's the key to achieving your goals?

First and foremost, it's important to understand the drivers that define project goals. Lower total cost of ownership (TCO) used to be the main catalyst for these projects, but there's been a shift. Today, application modernization is also being driven by new business requirements and new technology adoption. That's why I think it's important to communicate the benefits to all stakeholders through a well-vetted application rationalization plan. Moreover, we must demonstrate that legacy modernization is an ongoing process, not a one-time goal.

What's the best way to keep a project on schedule?

Once expectations are set, it's vital to develop a comprehensive, detailed schedule. Remember, these are complex projects; standard resources and processes do not apply. The biggest risk to any project is what's not seen in the beginning. That's why a detailed project plan with integration points throughout can address unforeseen issues. I also recommend setting up a monthly joint steering committee meeting.
This high-level discussion will effectively knock out roadblocks and help keep the project on schedule.

How can you avoid downtime during an implementation?

That's another great thing about today's capabilities. Today's application modernization plans can be accomplished without freezing your data or applications. For instance, solutions like Dell's ZeroIMPACT migration services or application rehosting and database conversions can provide a seamless transition. With that in mind, every project needs a solid fallback plan; we should prepare for any possibility when it comes to data migration and the like.

How do I find the right partner?

The right application modernization services vendor actively involves relevant stakeholders upfront and throughout a project's timeline. The right vendor has enterprise experience, is collaborative and can solve problems because it has seen them before. And most important: your vendor must understand your business needs. You can learn more about challenges and success factors, or listen to a recent webinar featuring Brandon Edenfield, Dell's executive director of Application Modernization Services. The bottom line is this: application modernization is inevitable. Success rides on proper planning and communication.

Tame the tangles with a desk-side data center


Messy wiring closets can increase IT maintenance costs. Credit: Spiceworks

Picture this: You're working in a branch office and the network goes down. You open a closet door where the computer equipment is stored, only to find a tower of boxes stacked precariously and a tangle of computer cables and wires coming at you. This is a typical scenario in remote and branch offices, workplace settings collectively known as ROBOs. Organizations rarely staff these places with dedicated IT support staff, leaving it up to the employees to troubleshoot, taking them away from the tasks they were hired to perform. Remote offices also commonly have infrastructure built on disparate, non-standard hardware, multiple systems management tools, a lack of space to scale and poor security. While some organizations maintain tight controls to ensure that infrastructure and applications remain consistent across locations, most do not, and IT ends up supporting a mishmash of hardware and software. Poor standardization makes solving offsite problems difficult, time-consuming and expensive. About 75 percent of IT budgets are spent on the infrastructure and personnel costs needed to support ongoing operations, according to A.T. Kearney. The most efficient IT departments spend less, freeing up more funding for upgrades and strategic technology that facilitates expansion and greater competitiveness.

175 pounds of data center

The ideal solution is a so-called data center in a box, which combines servers, storage and networking into a single, easy-to-use system. That box must also be designed to fit seamlessly underneath or next to an employee's desk and not require any special setup, power or cooling. Since IT is always looking to control costs, pricing is yet another consideration. Dell has addressed this with its high-performance PowerEdge VRTX, which removes the challenges and complexity of managing IT remotely.
PowerEdge VRTX provides up to four servers that Dell can customize with Intel Xeon E5-2600 v2 processors. It's virtualization-ready and designed to work with Microsoft Hyper-V or VMware vSphere. It is also easy to deploy, weighs about 175 pounds and stands 19 inches tall by 12 inches wide, so it can be located almost anywhere in the office. It has simple built-in monitoring and management, including an integrated console with enterprise-class tools for onsite or remote management. With PowerEdge VRTX, there is almost no need to cable servers to networking switches and then to external storage: everything is contained in a 5U tower or rackable chassis that delivers the redundancy and reliability required to meet the demands of today's always-on environments. As businesses look to reduce the complexity of IT and better serve their branch and remote offices, they need systems like PowerEdge VRTX that can be quickly deployed, managed with ease and scaled to meet today's business requirements.

Small business owners: Beware of the big data dump


Big data is a double-edged sword. The ability to monitor so many variables and business results can lead to higher ROI, faster sales and other benefits. Yet when small business owners monitor anything and everything, they risk losing sight of what matters and what's really within their capabilities. Time, capital and people are a company's most valuable assets, and a successful analytical strategy requires optimizing all three. To determine the ideal workflow, organizations need to start with their customers and look carefully at their processes for connecting with them. "Listen to everything and then start to figure out what the right questions are," explains Mark Panay, marketing director and co-founder at Contactzilla, a CRM platform. "Leverage the insight of your customer service and support teams. Don't use your support team as techies, use them as part of marketing." Client-facing teams are at the front lines and know customers' challenges. These team members can assess, qualitatively, which business milestones are most important. An analytics team should then translate these anecdotal insights into quantitative benchmarks using statistical tools for trend analysis.

Uncovering the nuances

Sometimes, important details aren't quantifiable. Organizations should use proxies for measurement instead. Prior to running Contactzilla, Panay encountered the challenge of measuring intangible goals for his mobile content and marketing firm's initiatives to drive visibility. "Measuring this was nigh on impossible, but in conjunction with the other things that we were doing, we were building our brand, creating statements of intent and attracting the kind of people that we wanted to work with," Panay says. He realized, however, that focus began with one key question.
"So why did we do it, and what did we get from it?" In this case, monetary ROI was only tangential to the impact of his efforts. "We learned that aligning with certain types of clients, even at a loss, helped us acquire other clients that wanted to be associated with them and use the same principles as the 'cool kids,'" says Panay. "The direct monetary ROI is irrelevant in this case, as it helped make the sale on multiple clients further down the chain."

Keeping a bird's eye view

Focus isn't everything. Marketers need to connect their analytical strategies to a bigger picture. "Managers should think beyond their silos and KPIs to company goals and which systems best track the full customer lifecycle, from awareness to subscription to product engagement to support," explains Laura Smous, content strategy lead at The Resumator, a tool for recruiting and tracking applicants. Marketing and analytics teams need to act cross-functionally to pursue a unified vision. Marketers also need to investigate what they don't know in addition to what they do. "The biggest breakdown for me is when visitors go from unknown to known," says Smous. Tools and technology can help business leaders bridge this gap and trace the connection. "Track the full journey to becoming a customer," Smous emphasizes.

Connecting with individuals

At the end of the day, marketing is about people. Automation may be efficient, but it is equally important for business leaders to balance their strategies with human-to-human connections. "Automated personalization doesn't have the same kind of impact as person-to-person relationships," says Panay.
"However, we find that person-to-person has a much greater benefit to the bottom line, despite its inability to scale automatically." Panay recommends diving deep into customer conversations. Position client-facing support teams at the front lines of this research process. Develop anecdotal reports for benchmarks and situations that aren't measurable. "For example, a customer recently asked the best way to solve a problem that Contactzilla clearly couldn't solve," says Panay. "We responded with a step-by-step tutorial on how to achieve what they wanted with another product. They couldn't believe it." Saving time means prioritizing connections as the core of marketing. Quantitative or qualitative, these stories will drive actionable insights and growth.
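Translating anecdotal support insights into quantitative benchmarks, as the article recommends, can start as simply as tagging each customer conversation and tracking tag shares over time. The sketch below is a hypothetical illustration; the tags, months and counts are invented.

```python
# Hypothetical sketch: tag support conversations, then compute a proxy
# metric (share of conversations per month carrying a given tag) so
# qualitative feedback becomes a trend line an analytics team can watch.
from collections import Counter

# Invented tagged conversations: (month, tag)
conversations = [
    ("2013-10", "pricing"), ("2013-10", "bug"), ("2013-10", "pricing"),
    ("2013-11", "bug"), ("2013-11", "pricing"),
    ("2013-12", "feature"), ("2013-12", "bug"),
]

def tag_share(tag):
    """Fraction of conversations carrying `tag`, keyed by month."""
    totals, hits = Counter(), Counter()
    for month, t in conversations:
        totals[month] += 1
        hits[month] += (t == tag)
    return {m: hits[m] / totals[m] for m in sorted(totals)}

print(tag_share("pricing"))  # monthly trend of pricing complaints
```

A falling share for one tag and a rising share for another is exactly the kind of quantitative benchmark a client-facing team's anecdotes can feed, without requiring every insight to be directly measurable in revenue.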