CMR INSIGHTS

 

Embracing Digital Transformation as-a-Service

by Jonathan Zhang and Hsiao-Wuen Hon

Businesses will find new opportunities in DTaaS software platforms.

Digital transformation (henceforth abbreviated as “DT”) is on the minds of many organizations. It promises to modernize firms by optimizing existing processes, enhancing customer experiences, and fostering innovation. According to a 2018 study by the World Economic Forum and Accenture, DT is expected to unlock $100 trillion in value over the next decade.

Not only does DT affect businesses’ competitiveness and the consumer experience, but pervasive digitization can profoundly impact many facets of human society, from government services, healthcare, defense, and communication to the future of work and the very definitions of firms and industries. It has the potential to break down socio-economic and cultural barriers.

Although many organizations are investing in DT, few are achieving the envisioned results. In a 2018 McKinsey Global Survey of over 1,700 executives, 80% of the respondents reported that their organizations had begun DT initiatives in recent years, but only 14% said their efforts had made and sustained performance improvements, and an alarmingly low 3% reported complete success at sustaining the change.

Against this backdrop of DT’s vast promise and its currently unsatisfactory performance, we are also witnessing a recent trend: DT is morphing from the previous model of discrete consulting engagements and IT integrations into a service model in which multiple technology and service providers combine on industry-vertical platforms to offer continuous, seamless, end-to-end transformation that can adapt to changing business environments.

We call this emerging model “Digital Transformation as-a-Service.” This model will democratize and accelerate digitization and digital transformation to all facets of society.

DTaaS not only provides flexible scalability; it also requires providers to interface more deeply with each other and integrate into the operations of the business to provide continuous adjustments, thereby increasing the success rate for sustained change.

Based on our experiences advising organizations on DT initiatives and discussions with diverse stakeholders, we offer guidance for managing and benefiting from DT in the emerging as-a-Service environment to firms and to technology and service providers. In short:

  • Firms should look beyond technology – they need to possess a customer-centric, data-centric, experimental, and adaptive mindset.

  • Technology and service providers need to have a closer alignment among upstream and downstream services and establish industry-vertical platforms to facilitate data sharing and seamless integration across different business functions.

From DT to DTaaS

DTaaS emerged in the same way the software industry once evolved from the product-based model to SaaS: driven by technological advancements and demands from the marketplace.

Once upon a time, firms would first decide to improve a particular operational process, say HR, enterprise resource planning, or sales. They would then purchase technology such as ERP software and hire service providers such as management consultants to implement the change. However, the improvements, even when successful, were often isolated and rarely involved organizational change. Often, once the technology was purchased and the consultants left, firms were left stranded without a coherent long-term plan, and millions of dollars’ worth of systems and data sat under-utilized.

DT is complex – it involves many stakeholders, relies heavily on IT integration with different business units from back-end to front-end, and requires continued monitoring and adjustment for sustained success. What customers experience is the result of multiple, interrelated business processes. These interrelationships and customer requirements are all evolving rapidly.

These factors make it difficult to isolate a single area for improvement. Accordingly, the previous management consulting approaches to DT are becoming insufficient. Sensing these market shifts, organizations are increasingly asking technology and service providers for end-to-end DT solutions that encompass seamless integration from strategic planning and cultural transformation to systems implementation and iterative tuning.

With the advances in cloud for data storage and security, artificial intelligence (AI) for process automation and business insights, and the internet of things (IoT) and mobile technologies that enable human-centered computing, DTaaS is no longer a one-time engagement; it is a continued transformation journey that can offer the following benefits:

  • Seamless merging of business services, technology, and hardware providers through industry-vertical platforms. These providers become partners for continued problem-solving and process improvement.

  • Cost-effectiveness and scalability – firms can get started on transformation quickly without substantial upfront investment in infrastructure and maintenance – infrastructure that will likely change very quickly in today’s environment. Resource needs can be scaled easily via elastic cloud computing.

  • Secure, cloud-based computing aligns well with the flexibility and mobility of the current workforce.

Guidance for Firms

So, how do firms succeed in the DTaaS environment?

First and foremost, firms must have a customer-centric culture that is understood by all functional and business units, regardless of whether they operate existing processes or drive innovation. Customer-centricity sets the scope for performance metrics and data collection.

Second, firms must have a data-centric culture. DT relies on constant and detailed data flow, and firms need to embrace the transformative power of data and set up a data pipeline infrastructure that includes data collection, cleansing, access, mining, experimentation, reporting, and visualization.

With a customer-centric culture in place, data collection efforts should be centered on the customer journey, from acquisition through retention, repurchase, and expansion to post-purchase behaviors such as word of mouth.

Customer-based outcomes can include acquisition cycle time, conversion rates, purchase frequency and amount, types of product purchased, browsing behavior and patterns, device usage, referrals, complaints, inquiries, and service encounters. Technologies such as video, GPS, voice, facial recognition, and sensors can collect insightful data beyond transactions, such as usage patterns and emotions. The database can then be supplemented with modern market research approaches such as the extraction of customer sentiment in the social sphere.
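
To make the data side concrete, the sketch below (in Python, with hypothetical field names and event types, since the article does not prescribe a schema) shows how journey events of the kind listed above might be captured in a uniform record and rolled up into two of the customer-based outcomes mentioned here, conversion rate and purchase frequency.

```python
# Minimal sketch of a customer-journey event record and two derived metrics.
# Field names and event types are hypothetical; adapt them to the firm's own data model.
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CustomerEvent:
    customer_id: str
    event_type: str        # e.g. "visit", "purchase", "complaint", "referral"
    channel: str           # e.g. "web", "app", "store"
    device: str            # e.g. "mobile", "desktop", "kiosk"
    timestamp: datetime
    amount: float = 0.0    # purchase amount, if any

def conversion_rate(events: list[CustomerEvent]) -> float:
    """Share of visiting customers who made at least one purchase."""
    visitors = {e.customer_id for e in events if e.event_type == "visit"}
    buyers = {e.customer_id for e in events if e.event_type == "purchase"}
    return len(buyers & visitors) / len(visitors) if visitors else 0.0

def purchase_frequency(events: list[CustomerEvent]) -> dict[str, int]:
    """Number of purchases per customer."""
    counts: dict[str, int] = defaultdict(int)
    for e in events:
        if e.event_type == "purchase":
            counts[e.customer_id] += 1
    return dict(counts)
```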

Although service providers in DTaaS verticals will have some industry-domain knowledge, nobody knows the business better than the firm itself. Therefore, firms need to set up and empower a data science organization. We have noticed that firms that have invested in a Chief Data Officer (CDO) and a data science team have a much higher DT success rate than those that have not. Without an in-house data science team, DT is often treated as a one-time transaction and rarely takes root in the organization.

We suggest staffing at least the following leadership roles to put a data infrastructure in place:

  • Chief Data Officer (or a variant of the title in the C-suite such as Chief DT Officer). This person will work closely with the CEO and other C-level colleagues to ensure DT happens at the enterprise-level.

  • Head of business intelligence and artificial intelligence. This team must be able to make sense of large volumes of data and translate them into ways of using data and technologies to transform the business in a customer-centric way.

  • DT business leads. They are evangelists for DT within the organization who enable different parts of the business to see improvements from digital efforts, such as increased conversion from the deployment of chatbots or higher repurchase rates from marketing personalization. These evangelists can then share the success stories across the organization and secure buy-in from other business units and partners, thus creating a virtuous digital culture.

Third, top management needs to establish bold enterprise-level strategies (rather than strategies at the functional and business-unit level) that are committed to building new businesses and focused on key performance outcomes.

For example, customer-centric performance outcomes include shortened acquisition cycle, reduced churn, improved cross-selling, or improved satisfaction and referral. Here, we recommend having both short-term and long-term metrics – while increased customer engagement and satisfaction can be immediately detected through a revamped website, changes in customer lifetime value will take months to years to show up.

Here, a big data approach can identify previously undiscovered correlations and causal relationships among different business units, and it can enable top managers to mobilize the various units to focus on these customer-centric outcomes in innovative ways.

Finally, getting the strategy right is important, but as the management theorist Peter Drucker once said, “Culture eats strategy for breakfast.” Culture is the values, customs, beliefs, and symbolic practices that men and women live and breathe every day (Eagleton 2016).

Firms should possess an experimental, agile, and risk-taking culture, and their leaders should embody and promote what the psychologist Carol Dweck referred to as a “growth mindset” (Dweck 2008). The future can never be fully known in a dynamic market (Slater and Narver 1998). To understand the evolving needs of digital customers in the future, firms need to conduct market experiments and learn from the results of these experiments.

Defining multi-year investment requirements and performance targets makes little sense when the landscape is increasingly unpredictable. DT often requires weekly adjustments, and previous hypotheses might be tweaked or overhauled as new data streams in. One benefit of DTaaS is that firms can flexibly try many things without committing to an entire infrastructure. So, experiment.

Relatedly, DT’s performance outcomes can be volatile. Market experiments are inherently risky and always involve some failure. The failure could result from mainstream consumers’ lack of knowledge about how their future needs may be better satisfied with digital technologies, or from their inertia in changing behaviors. Successful DT is more likely for firms with risk-taking and risk-tolerant cultures. To overcome fears of failure and institutional inertia, employees should be rewarded for taking risks of an appropriate level.

Guidance for Tech and Service Providers

In the past, service providers worked on areas of improvement independently and sequentially. The era of DTaaS calls for a much closer integration of software, hardware, business services, and consulting in order to create digital customers and digital experiences.


DT creates digital customers: customers who provide in-use information, defined as information acquired in real time as the product or service is consumed (Ramaswamy and Ozcan 2018). Insight from in-use information can then create additional value through interactions with customers in the form of personalized, real-time services or improvements to future product iterations. Value is thus co-created with customers, and the value co-creation can range all the way from the production to the consumption stages.

For example, by applying analytics to the in-use data stream from its engines together with weather information, GE can provide real-time guidance for pilots to adjust altitude and speed to optimize fuel consumption, essentially providing performance customization of its jet engines for every flight. Similarly, Caterpillar and Siemens can leverage in-use data to predict individual component failures and broader systemic failures before they happen.

Furthermore, customer in-use data can be shared with other businesses in the ecosystem in real time to enhance the customer experience. Just as a digital-native company like Uber can share a rider’s destination to enable hotel check-ins, safety alerts, and nearby restaurant suggestions that enhance customer value in the travel domain, so can many traditional businesses if they apply the same thinking.

Lightbulbs with sensors can generate in-use information on motion patterns, which can then be used to enhance security systems (smart homes), improve storage efficiency (smart warehouses), or detect abnormal activities such as gunshots and alert public safety systems (smart cities).

Mattresses embedded with IoT sensors can leverage in-use information to provide insights on sleeping patterns and advise improved sleeping habits. They can then share the in-use information with other smart objects such as temperature control, music, and light, to create an individualized and dynamically adjusted sleeping environment depending on the stages and quality of sleep. The biometric data during sleep can then be shared with the user’s physician to assess potential health issues.

In industrial domains, Caterpillar can share in-use information on digging activities with concrete pourers and other machinery at construction sites to synchronize activities and reduce construction costs.

Because in-use information loses its value once the moment of use has passed, sharing it does not carry the same competitive risk as sharing customer data in the traditional sense, such as past purchase behaviors.

Thus, the interdependent value co-creation from production to consumption stages through the sharing of in-use data calls for the creation of digital platforms in each industry vertical, with a constant focus on enhancing the customer experience, regardless of whether the industry is transportation, sleep, retail, construction, entertainment, manufacturing, or government.

Data sharing and close collaboration can benefit all parties on the platform as the value of the industry platform grows. Therefore, not only should data APIs be established to enable data flow, but providers should also appoint staff to liaise with partner organizations, akin to a “human API,” to ensure that digital experiences flow well. Protocols and regulations need to be updated regarding security and privacy in order to facilitate the sharing of information and knowledge.
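
As a rough illustration of what such a data API might look like, the Python sketch below forwards in-use events to ecosystem partners only within contractually agreed scopes. All names (InUseEvent, PARTNER_SCOPES, share_event) are hypothetical, and a production platform would add authentication, encryption, and auditing.

```python
# Illustrative sketch of partner-scoped sharing of in-use events on an
# industry-vertical platform. Names and scopes are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class InUseEvent:
    device_id: str
    event_type: str                  # e.g. "sleep_stage", "dig_cycle", "motion"
    payload: dict = field(default_factory=dict)
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Which event types each partner is entitled to receive (set by contract/consent).
PARTNER_SCOPES = {
    "thermostat_vendor": {"sleep_stage"},
    "construction_scheduler": {"dig_cycle"},
}

def share_event(event: InUseEvent, partner: str) -> bool:
    """Forward an in-use event to a partner only if it falls within that
    partner's agreed scope; return True if the event was shared."""
    allowed = PARTNER_SCOPES.get(partner, set())
    if event.event_type not in allowed:
        return False
    # In a real platform this would be an authenticated API call to the partner.
    print(f"-> {partner}: {event.event_type} @ {event.timestamp.isoformat()}")
    return True
```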

Sometimes it is also important to look at existing partner relationships with a fresh set of eyes. Partnership strategies that failed in the past might be useful in the future. Technology changes, the environment changes, and people change; it is a mistake to write off any previous relationship as a lost cause.

Cases of Digital Transformation

As the cases below illustrate, companies of different sizes have embodied these DT best practices by focusing on tangible outcomes via an assemblage of appropriate technologies. Accordingly, the various technology, hardware, and software providers have aligned within the industry ecosystem to enable coherent DT implementations, resulting in enhanced customer experiences and firm outcomes.

Case 1. Royal Caribbean: Using video and computer vision to streamline boarding, enhance on-board guest experience, and reduce waste

The 50-year-old cruise company Royal Caribbean started an “Imagineering Department” to integrate digital interactive design throughout the customer journey, streamlining the boarding process and enhancing on-board experiences through a mix of digital and physical realms.

A large cruise ship is a floating city that combines a hotel, entertainment, and food and beverage, along with complicated logistics. Royal Caribbean believes that the right use of technology, data, and AI will be a key differentiator in the industry in the future.

Starting with the check-in, the team envisioned an “invisible experience,” fueled by customer-submitted cellphone selfies and pre-checks.

Royal Caribbean worked with a leading tech provider’s commercial software engineering (CSE) team, which offered resources the cruise line didn’t have in-house, such as cognitive services and cloud computing capability that allowed face recognition within milliseconds. Adopting an open-source mindset and experimenting with different hardware providers, the CSE team worked alongside Royal Caribbean every step of the way to develop a solution that was both “invisible” and offered guidance feedback.

In addition to the video technology, cloud, video analytics, AI, and BI dashboards are used to manage guest experiences daily.

For example, the closed-circuit cameras installed in every public area across Royal Caribbean’s fleet became an endless well of data for computer vision to analyze venues, retail engagement, guest flow, and movement. Using AI to anonymously track bodies rather than faces, these cameras observe how long a person stays in a certain area, traffic paths, and population density — all important yet nearly invisible factors in streamlining the guest experience. Which retail displays get the best engagement? When should the “roving piano” relocate itself? How long after guests are seated at a restaurant do they receive their meal?
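
The article does not disclose Royal Caribbean’s algorithms, but the idea can be sketched as follows: assuming an upstream computer-vision system that emits anonymous (track_id, zone, timestamp) observations, a small aggregation step can yield the dwell-time and traffic measures described above.

```python
# Sketch: derive dwell time and zone traffic from anonymized track observations.
# Assumes an upstream vision system already emits (track_id, zone, timestamp)
# tuples with no identity attached; the aggregation below is illustrative only.
from collections import defaultdict
from datetime import datetime

def zone_statistics(observations: list[tuple[str, str, datetime]]) -> dict:
    """observations: (anonymous_track_id, zone_name, timestamp).
    Returns visit counts and average dwell time (seconds) per zone."""
    first_seen: dict[tuple[str, str], datetime] = {}
    last_seen: dict[tuple[str, str], datetime] = {}
    for track_id, zone, ts in observations:
        key = (track_id, zone)
        first_seen[key] = min(first_seen.get(key, ts), ts)
        last_seen[key] = max(last_seen.get(key, ts), ts)

    dwell = defaultdict(list)
    for (track_id, zone), start in first_seen.items():
        dwell[zone].append((last_seen[(track_id, zone)] - start).total_seconds())

    return {
        zone: {"visits": len(times), "avg_dwell_s": sum(times) / len(times)}
        for zone, times in dwell.items()
    }
```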

The system of cameras and the associated AI algorithms can also be trained to produce back-end logistics and sustainability innovations. For example, the system can be trained to monitor food consumption so that kitchens reduce waste by not over-preparing food. Similar analytics can be used to optimize the movement of goods, restocking frequency, and recycling practices.

On the reimagined consumer experience, the “Sky Pad,” a bungee experience on the top deck of one of its ships, allows customers to “jump” from planet to planet via a virtual reality headset. Other futuristic ideas include “Bring Me a Drink,” where a customer can be anywhere on the ship, push a cellphone button, and receive a beverage from the nearest bar, and “VR Dining,” which taps into the science of nerve centers in the brain to create visuals that enhance the taste of the food.

Case 2. Fruit of the Loom: Using analytics to predict consumer demand with weather changes and establish a nimble inventory system

Using analytics on weather forecasts and store demand data across the nation, the 166-year-old clothing brand Fruit of the Loom found that when the forecast shows an autumn temperature decrease of 12 degrees or more occurring within six days, many U.S. consumers consider it a cue to buy more fleece.

The actual temperature doesn’t matter; it is the size of the decrease that counts. In other words, winter is relative, and ‘cold’ for somebody who lives in Florida is different from ‘cold’ for somebody in New York. Nevertheless, if the weather is about to cool by 12 degrees, people in both Florida and New York see that as cold – and time to go to the store for fleece.

The finding gives Fruit of the Loom predictive analytics to make its supply chain more nimble, ensuring its retail partners have full stocks of fleece products ahead of autumn cold spells, thus boosting sales.

“Instead of having a cold-weather event happen, then trying to very quickly ship in inventory, we’ve been able to be more proactive by having that inventory in place in the store days before,” says the senior manager of data science at Fruit of the Loom. The company then worked with technologists to dig below the initial correlation between cold weather and higher sales of winter wear. They sought to answer two consumer-psychology questions: 1) how many days out must cold weather appear in a forecast, and 2) how large must the temperature change be before consumers respond?

To address these questions and build decision support systems, the team set out to build a business intelligence dashboard. They collected store-level inventory metrics from a Fruit of the Loom national retail partner. They compared those numbers with 10-day temperature forecasts, supplied by AccuWeather, covering cities served by the retailer, with the analysis focused on October and November.

The data scientists then used a relational database to process the vast datasets and cloud-based machine learning to model the data and pinpoint the temperature change that prompts consumers to act: 12 degrees within six days.

In those cases, Fruit of the Loom can now notify a retailer about a predicted temperature dip and any stores in that area with low stocks of fleece items, arranging for new shipments to arrive before the cold weather does.
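
A minimal sketch of this trigger logic, assuming hypothetical data structures (daily forecasts per city and fleece stock per store), might look like the following; the 12-degree and six-day thresholds come from the finding above, while the low-stock cutoff is illustrative.

```python
# Sketch of the trigger rule described above: flag a market when the 10-day
# forecast shows a drop of at least 12 degrees within six days, then list
# stores in that market with low fleece stock. Data structures are hypothetical.
def fleece_trigger(forecast_f: list[float], drop_threshold: float = 12.0,
                   window_days: int = 6) -> bool:
    """forecast_f: daily high temperatures (deg F) for the next ~10 days."""
    for i in range(len(forecast_f)):
        for j in range(i + 1, min(i + window_days + 1, len(forecast_f))):
            if forecast_f[i] - forecast_f[j] >= drop_threshold:
                return True
    return False

def stores_to_restock(forecasts: dict[str, list[float]],
                      stock: dict[str, dict[str, int]],
                      min_units: int = 50) -> dict[str, list[str]]:
    """forecasts: city -> daily forecast; stock: city -> {store_id: fleece units}."""
    alerts = {}
    for city, temps in forecasts.items():
        if fleece_trigger(temps):
            low = [s for s, units in stock.get(city, {}).items() if units < min_units]
            if low:
                alerts[city] = low
    return alerts
```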

The BI dashboard then produces a heat map of how responsive demand is to weather changes across 30 U.S. states. The most “responsive” markets included Washington, D.C., Charlotte, Atlanta, Tampa, cities in Texas, Los Angeles, Phoenix, and Seattle.

The northern part of the U.S. still feels an impact but the degree is much smaller, which makes sense – if you are in a typically cold state, you have enough clothes in storage, and if it gets a little bit colder, you don’t care. People in Florida are more sensitive.

Furthermore, the findings revealed a marketing nuance: a sharp weather drop acts as a free promotion, so advertising winter wear ahead of a 12-degree temperature dip is a waste of marketing resources. The data showed that sales revenues for Fruit of the Loom fleece were nearly identical, with or without accompanying advertising, during the days just before temperatures were expected to plummet.

Case 3. Starbucks: Using cloud, sensors, and blockchain to form more personal connection with its customers

With over 100 million weekly customers, Starbucks is creating an even more personal, seamless customer experience in its stores by implementing an assemblage of technologies ranging from IoT sensors and cloud computing to blockchain.

Under the leadership of Starbucks’ executive vice president and chief technology officer, the team of technologists was engaged in innovations dedicated to the core Starbucks customer experience – “Everything that we do is centered around the customer connection in the store, the human connection, one person, one cup, one neighborhood at a time.”

Making recommendations more relevant with reinforcement learning

Starbucks has been using reinforcement learning technology — a type of machine learning in which a system learns to make decisions in complex, unpredictable environments based upon external feedback — to provide a more personalized experience for customers who use the Starbucks mobile app.

Within the app, customers receive tailor-made order suggestions generated via a reinforcement learning platform that is built and hosted in the cloud. Through this technology and the work of Starbucks data scientists, 16 million active Starbucks Rewards members now receive thoughtful recommendations from the app for food and drinks based on local store inventory, popular selections, weather, time of day, community preferences, and previous orders.

This personalization means that customers are more likely to get suggestions for items they will enjoy. For example, if a customer consistently orders dairy-free beverages, the platform can infer a non-dairy preference, steer clear of recommending items containing dairy, and suggest dairy-free food and drinks.
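
Starbucks’ actual reinforcement learning platform is proprietary; the sketch below only illustrates the preference-inference step described in this example, using hypothetical record formats: infer a non-dairy preference from order history and filter candidate recommendations before ranking.

```python
# Illustrative sketch of the preference-inference step described above (not
# Starbucks' actual platform): infer a non-dairy preference from order history
# and filter candidate recommendations before ranking by score.
def infers_non_dairy(order_history: list[dict], threshold: float = 0.9) -> bool:
    """order_history items: {"item": str, "contains_dairy": bool}."""
    if not order_history:
        return False
    non_dairy = sum(1 for o in order_history if not o["contains_dairy"])
    return non_dairy / len(order_history) >= threshold

def filter_candidates(order_history: list[dict], candidates: list[dict]) -> list[dict]:
    """candidates items: {"item": str, "contains_dairy": bool, "score": float}."""
    if infers_non_dairy(order_history):
        candidates = [c for c in candidates if not c["contains_dairy"]]
    return sorted(candidates, key=lambda c: c["score"], reverse=True)
```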

In essence, reinforcement learning allows the app to get to know each customer better. While the recommendations are driven by machines, the end goal is personal interaction and connection.

Delivering smooth and hassle-free coffee experiences through IoT

Each Starbucks store has more than a dozen pieces of equipment, from coffee machines to grinders and blenders, that must be operational around 16 hours a day. A glitch in any of those devices can mean service calls that rack up repair costs. More significantly, equipment problems can potentially interfere with Starbucks’ primary goal of providing a consistently high-quality customer experience.

To reduce disruptions to that experience and securely connect its devices in the cloud, Starbucks is partnering with a technology provider to deploy a secure application platform designed for the coming wave of connected IoT devices across its store equipment.

The IoT-enabled machines collect more than a dozen data points for every shot of espresso pulled, from the type of beans used to the coffee’s temperature and water quality, generating more than 5 megabytes of data in an eight-hour shift. The tech partner then worked with Starbucks to develop an external device that connects the company’s various pieces of equipment to the platform in order to securely aggregate data and proactively identify problems with the machines. Thanks to these cloud-connected devices, staff can spend more time hand-crafting beverages and interacting with customers and less time on machine maintenance.
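
The exact telemetry fields and thresholds are not public, but a simple screening step of the kind implied here could look like the following sketch, where the field names and acceptable ranges are assumptions.

```python
# Sketch: screen per-shot espresso telemetry for out-of-range readings and flag
# machines for proactive maintenance. Field names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class ShotTelemetry:
    machine_id: str
    bean_type: str
    water_temp_c: float
    water_quality_ppm: float   # total dissolved solids
    extraction_time_s: float

# Hypothetical acceptable range per metric: (low, high)
LIMITS = {
    "water_temp_c": (88.0, 96.0),
    "water_quality_ppm": (75.0, 250.0),
    "extraction_time_s": (20.0, 35.0),
}

def maintenance_flags(shots: list[ShotTelemetry], max_violations: int = 5) -> set[str]:
    """Return machine IDs with more than `max_violations` out-of-range shots."""
    violations: dict[str, int] = {}
    for s in shots:
        for metric, (low, high) in LIMITS.items():
            value = getattr(s, metric)
            if not (low <= value <= high):
                violations[s.machine_id] = violations.get(s.machine_id, 0) + 1
    return {m for m, count in violations.items() if count > max_violations}
```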

The cloud-connected devices also enable Starbucks to send new coffee recipes directly to machines, something it previously did by manually delivering the recipes to stores via thumb drive several times a year. Now the recipes can be delivered securely from the cloud to the connected devices at the click of a button. The recipe-push project greatly simplified the previously complex task of updating recipes for 30,000 stores in 80 markets and resulted in massive cost savings.

Longer-term, the company envisions leveraging the IoT platform for new uses such as managing inventory and ordering supplies and will encourage suppliers of its devices to build the solution into future versions of their products.

Using blockchain to share coffee’s journey with customers

Starbucks is also innovating ways to trace the journey that its coffee makes from farm to cup — and to connect the people who drink it with the people who grow it.

The company is developing a feature for its mobile app that shows customers information about where their packaged coffee comes from: where it was grown and what Starbucks is doing to support farmers in those locations, where and when it was roasted, tasting notes, and more.

For Starbucks, which has long been committed to ethical sourcing, knowing where its coffee comes from is not new. However, digital, real-time traceability will allow customers to know more about their coffee beans. Perhaps even more important and differentiating are the potential empowering benefits for coffee farmers to know where their beans go after they sell them.

This new transparency is powered by a cloud-based blockchain service, which allows supply chain participants to trace both the movement of their coffee and its transformation from bean to final bag. Each change is recorded to a shared, immutable ledger providing all parties a complete view of their products’ journey.
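
The underlying idea of a shared, immutable ledger can be illustrated with a toy hash-chained log; this is a conceptual sketch, not the cloud blockchain service itself, and the batch identifiers and event names are made up. Each custody event references the hash of the previous entry, so any later alteration is detectable.

```python
# Toy sketch of a tamper-evident traceability ledger: each custody event is
# hash-chained to the previous one, so altering any entry breaks the chain.
import hashlib
import json
from datetime import datetime, timezone

class TraceLedger:
    def __init__(self):
        self.entries: list[dict] = []

    def record(self, batch_id: str, event: str, location: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {
            "batch_id": batch_id,          # e.g. a bag or lot identifier (hypothetical)
            "event": event,                # e.g. "harvested", "washed", "roasted"
            "location": location,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": prev_hash,
        }
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append(body)
        return body

    def verify(self) -> bool:
        """Recompute every hash; returns False if any entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Because every participant appends to the same chain, a farmer, exporter, roaster, and retailer all see the same verifiable history of each batch.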

This DT initiative not only empowers farmers with more information and visibility once the beans leave their farms, but also allows customers to see the impact their coffee purchase has on the people they’re supporting.

Case 4. Transforming farming through drones and the intelligent edge

Digital transformation isn’t the exclusive privilege of large companies.

Farmer Sean Stratman is using drones and the intelligent edge to get real-time information about issues like soil moisture and pests. It is the kind of information that can benefit farmers around the world.

The need to modernize farming to increase yield and reduce waste is critical - by the year 2050, the current world population of 7.6 billion is expected to reach 9.8 billion. Food production will have to increase dramatically to keep up with that growth, but there is a limited amount of additional arable land available for farming.

A new partnership between a leading tech company and a drone maker builds on both companies’ work with data and agriculture, and it could make it easier and more affordable for farmers like Stratman to quickly get the information they need to make crucial decisions about soil moisture and temperature, pesticides, and fertilizer. Hours and days spent walking or driving the fields to detect problems can be eliminated.

The tech firm’s farm-focused program sends large amounts of data from ground-based sensors, tractors, and cameras to a computer on the farm using TV white spaces, a type of internet connectivity similar to Wi-Fi but with a range of a few miles. TV white spaces are unused TV broadcast spectrum, which is plentiful in rural areas where most farms are located, and where standard internet connections are often spotty.

New machine learning algorithms process and analyze the data and run on the cloud-based IoT edge, which delivers cloud intelligence locally, on the edge of a larger computing network. In Stratman’s case, the “edge” is in the barn, a seemingly old-fashioned setting for a high-tech solution. With the IoT edge, not all the data needs to be sent to the cloud; the edge device sits on the farm, ingests much of the data, and applies intelligence on top of it to generate actionable insights for the farmer. The collected data and insights are then periodically synced with the cloud.

The farm program uses a commercially available drone for the project at the farm Stratman manages. The drone company provides on-the-fly generation of stitched aerial imagery, which is used by the machine learning algorithms running on the IoT edge to create detailed heatmaps. Those heatmaps enable farmers to quickly identify crop stress and disease, pest infestation, or other issues that may reduce yield. The maps are transmitted using TV white space technology to the IoT edge device located on the farm.
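
The partnership’s actual models are not public. As a rough sketch, if the stitched imagery arrives as per-cell red and near-infrared reflectance grids (an assumption), the edge device could compute an NDVI-style stress heatmap locally and sync only a compact summary to the cloud.

```python
# Sketch: compute an NDVI-style vegetation index on the edge from stitched drone
# imagery and keep only a compact summary for periodic cloud sync. The input
# format (per-cell red and near-infrared reflectance grids) is an assumption.
import numpy as np

def stress_heatmap(red: np.ndarray, nir: np.ndarray, stress_threshold: float = 0.4):
    """Return (ndvi, stressed_mask). Cells with low NDVI are flagged as stressed."""
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    return ndvi, ndvi < stress_threshold

def cloud_summary(ndvi: np.ndarray, stressed: np.ndarray) -> dict:
    """Small summary synced periodically to the cloud instead of raw imagery."""
    return {
        "mean_ndvi": float(ndvi.mean()),
        "stressed_fraction": float(stressed.mean()),
    }
```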

The drone maker says the partnership with the tech company means that both companies “can go a lot further together because we can leverage information that might be drone-based, but also in conjunction with ground-based or edge-based processes.”

“This is the first time where we’ve really taken all these different components, using some of the software we’ve developed, some of the algorithms we’ve developed, our drones and the third-party sensors – and integrated all of that into a wider solution with a partner,” says the drone company executive, “In terms of the complexity of what’s involved, in order to leverage all of these unique aspects, this is a first – and it’s really exciting.”

Stratman is using the heatmaps to help with everything from his planting strategy – for example, whether the soil temperature is ripe for seed germination – to learning where beavers had created dams along a lengthy drainage ditch, creating flooding in some of his fields.

“This next year, I’m looking at identifying the soil humidity levels that are ideal for various soil working paths rather than putting an implement on my tractor and going out and saying, ‘The conditions are less than ideal for that particular tractor implement,’” he says.

“It will be really great to look at my program with IoT edge, and results from the drone and say, ‘Hey, look, my soil humidity is at 40 percent; it’s time to put on the tiller.’ It’s going to be beneficial, saving me time and trouble over doing something the old-fashioned way, the hard way.”
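
Stratman’s 40-percent rule can be sketched as a simple threshold check on recent edge sensor readings; the sensor interface, averaging, and tolerance below are assumptions for illustration.

```python
# Tiny sketch of the decision rule Stratman describes: check recent soil-humidity
# readings from field sensors against a target before putting the tiller on.
def ready_to_till(field_readings: list[float], target_pct: float = 40.0,
                  tolerance_pct: float = 5.0) -> bool:
    """field_readings: recent soil-humidity readings (percent) from field sensors."""
    if not field_readings:
        return False
    avg = sum(field_readings) / len(field_readings)
    return abs(avg - target_pct) <= tolerance_pct
```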

The precision agriculture techniques resulting from these DT initiatives will enable farmers to use their resources – water, land, fertilizer – more wisely in the future. They will enable farmers not only to become more profitable but also to increase yields, reduce environmental impact, and help feed the world.

References

1. Acemoglu, D., & Restrepo, P. (2019). Automation and new tasks: how technology displaces and reinstates labor. Journal of Economic Perspectives, 33(2), 3-30.

2. Breazeal, C. (2003). Toward sociable robots. Robotics and Autonomous Systems, 42(3-4), 167-175.

3. Comin, D., & Mestieri, M. (2018). If technology has arrived everywhere, why has income diverged?. American Economic Journal: Macroeconomics, 10(3), 137-178.

4. Dweck, C. S. (2008). Mindset: The New Psychology of Success. New York: Ballantine Books.

5. Eagleton, T. (2016). Culture. Yale University Press.

6. McKinsey (2018). McKinsey Global Survey on Digital Transformations (https://www.mckinsey.com/business-functions/organization/our-insights/unlocking-success-in-digital-transformations)

7. North, D. C. (2002). Institutions and economic growth: a historical introduction. In International political economy (pp. 57-69). Routledge.

8. Ramaswamy, V., & Ozcan, K. (2018). Offerings as digitalized interactive platforms: A conceptual framework and implications. Journal of Marketing, 82(4), 19-31.

9. Slater, S. F., & Narver, J. C. (1998). Customer‐led and market‐oriented: let's not confuse the two. Strategic Management Journal, 19(10), 1001-1006.

10. The World Economic Forum (2018). Unlocking $100 Trillion for Business and Society from Digital Transformation. (http://reports.weforum.org/digital-transformation/wp-content/blogs.dir/94/mp/files/pages/files/dti-executive-summary-20180510.pdf)



Jonathan Zhang
Jonathan Z. Zhang is Associate Professor of Marketing and the Dr. Ajay Menon Professor in Business at Colorado State University. His research focuses on understanding how customer attitudes and behaviors change in new economic, social, and technological environments and helps develop company strategies to address these changes.
Hsiao-Wuen Hon
Hsiao-Wuen Hon is Corporate Vice President of Microsoft, Chairman of Microsoft’s Asia-Pacific R&D Group, and Managing Director of Microsoft Research Asia. He drives Microsoft’s overall strategy for research and development in the Asia-Pacific region, as well as collaborations with academia. An IEEE Fellow and a Distinguished Scientist of Microsoft, Dr. Hon is an internationally recognized expert in speech technology. He has published more than 100 technical papers in international journals and at conferences and holds three dozen patents in several technical areas. Dr. Hon received a Ph.D. in Computer Science from Carnegie Mellon University.
