Infographic Blogs for 2019
2019 was a year full of outstanding customer engagements and provocative teaching experiences across numerous universities. My eyes were opened to many new opportunities to integrate economics, design thinking, big data and data science (AI / ML / DL) to further my case for a Nobel Prize in Economics (which I’d prefer not to be awarded posthumously). That includes helping organizations:
- Monetize a digital asset (data) that never wears out, never depletes and can be used across an unlimited number of use cases
- Integrate AI into physical assets (cars, trucks, trains, compressors, elevators, cranes, etc.) to create assets that appreciate, rather than depreciate, in value through the learning accumulated with usage
So, while we wait for that call from Stockholm, let’s take a look at my 10 favorite 2019 blogs:
There are many valuable lessons that data scientists can learn from the movie "The Incredibles" about the challenges of creating smart products like autonomous vehicles, trains and factory robots. And maybe the biggest challenge in developing smart, autonomous products is knowing when "good enough" is actually "good enough". When trying to optimize the operations of these smart, autonomous products, one must be prepared to discover that the current path to performance optimization isn't actually the optimal path, and the data science team must be willing to jettison their existing work and try a different approach that might lead to a better-performing analytic model.
This is an important lesson for the creation of our AI-infused "smart" products – that there must be constant testing, learning, and maybe even some unlearning and re-starting afresh in order to find the optimal models.
“How effective is your organization at leveraging data and analytics to power your business models?”
Organizations that expect to survive the avalanche of new digital technologies must "think differently" about how they structure their operational and business models. Merely applying new technologies to optimize existing operational processes is just "paving the cow path": taking existing inefficient processes and making them marginally better. And marginal improvements won't win the day from a business model digital transformation perspective.
One has to take the time to think…to think differently…about how new digital technologies and the resulting customer, product and operational data can be used to create new sources of customer, product and operational value.
New digital technologies coupled with advanced analytics will enable organizations to create autonomous entities that not only reduce the need for human technicians and engineers, but also radically transform the sources of customer, product and operational value. These organizations can embrace the economic value curve to digitally transform their business and operational models by converting human-observed static heuristics into AI algorithms that can drive automation and maybe even autonomous operations.
Time to think outside the box. Instead of focusing on replicating yesterday’s operational best practices, grab the economic value curve by the throat and dramatically change how your organization identifies, captures and scales these new sources of operational value via autonomous capabilities.
While the actual mechanics of how a neural network works are much more complicated (lots of math and calculus), the basic concepts are really not that hard to understand:
- Backpropagation improves the accuracy of a neural network's predictions by gradually adjusting the weights to shrink the difference between the model's predicted results and the actual results.
- Stochastic Gradient Descent is used to minimize some cost function by iteratively moving in the direction of steepest descent as defined by the negative of the gradient (slope).
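The two bullets above can be sketched in a few lines of Python. This is a minimal, illustrative toy, not a real neural network – the data, the single-weight model y = w·x, and the learning rate are all assumptions chosen for clarity. It applies stochastic gradient descent to a squared-error cost, stepping the weight in the direction of the negative gradient:

```python
import random

# Toy example: fit y = w * x with stochastic gradient descent.
# Synthetic data generated from the true relationship y = 3x.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0, 2.5]]

w = 0.0              # initial weight guess
learning_rate = 0.05

for epoch in range(200):
    random.shuffle(data)              # "stochastic": visit samples in random order
    for x, y in data:
        prediction = w * x
        error = prediction - y        # gap between predicted and actual results
        gradient = 2 * error * x      # d/dw of the cost (w*x - y)^2
        w -= learning_rate * gradient # step against the gradient (steepest descent)

print(round(w, 2))
```

With enough passes over the shuffled data, the weight converges toward the true value of 3.0 – the "gradually adjusting the weights" behavior in miniature.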
In order to make a holistic AI utility determination, collaboration across a diverse set of internal and external stakeholders is required to identify those metrics against which AI model progress and success will be measured. The AI utility determination requires the careful weighing of the metrics associated with the financial/economic, operational, customer, societal, environmental and spiritual dimensions.
The key to accelerating the economic learning curve isn't just accumulating experience; it also includes:
- Begin with an end in mind by focusing on your organization’s key business initiatives
- Understand the technology capabilities…but within a business frame
- Build out the solution architecture to deliver “intelligent” applications and “smart” entities
- Use Design Thinking to drive AI organizational alignment and adoption
Blend the value creation focus of Economics with the customer, product and operational insights of Data Science and the ideation, alignment and adoption capabilities of Design Thinking to help organizations to exploit the economic value of Artificial Intelligence.
Data in and of itself provides zero value as defined by Generally Accepted Accounting Principles, or a "value in exchange" valuation methodology. However, if we use an economics approach – a "value in use" valuation methodology – then we have a framework for defining the value of data, which is determined by where and how the data is used to create new sources of quantifiable customer, product and operational value.
Check out "Applying Economic Concepts to Big Data" for more details on the University of San Francisco research project that Professor Mouwafac Sidaoui and I conducted on determining the economic value of data.
The Economic Digital Asset Valuation Theorem – which leverages the economic characteristics of digital assets (they never wear out, never deplete and can be re-used across an infinite number of use cases at near-zero marginal cost) – highlights how the unique characteristics of digital assets manifest themselves at the macro-economic level:
- Economic Costs Flatten. The cumulative costs of the data and analytic digital assets flatten as the Marginal Cost of re-using those digital assets approaches zero.
- Economic Value Grows. Re-use of the data and analytics across future use cases accelerates time-to-value and de-risks those use cases.
- Economic Value Accelerates. The cumulative Economic Value of the digital assets eventually accelerates through the refinement of the digital asset; the analytic modules get more accurate with each reuse, driving continuous improvements in predictive model effectiveness.
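As a back-of-the-envelope illustration of the three effects above, here is a small Python sketch with purely hypothetical numbers – the build cost, marginal re-use cost, per-use-case value and refinement rate are all assumptions, not figures from the theorem itself. The cumulative cost curve flattens after the first use case bears the build cost, while the cumulative value curve grows and accelerates as the shared analytic modules are refined:

```python
# Hypothetical, illustrative numbers only.
initial_build_cost = 100.0  # cost to build the data/analytic digital asset
marginal_reuse_cost = 2.0   # near-zero cost to re-use it for a new use case
base_use_case_value = 50.0  # value delivered by the first use case
refinement_rate = 0.10      # each re-use refines the models, boosting value 10%

cumulative_cost, cumulative_value = 0.0, 0.0
for use_case in range(1, 9):
    # Costs flatten: only the first use case bears the build cost
    cumulative_cost += initial_build_cost if use_case == 1 else marginal_reuse_cost
    # Value grows and accelerates: refined analytic modules deliver more value
    cumulative_value += base_use_case_value * (1 + refinement_rate) ** (use_case - 1)
    print(f"use case {use_case}: cost={cumulative_cost:7.1f}  value={cumulative_value:7.1f}")
```

By the eighth use case in this toy scenario, cumulative value has outrun cumulative cost several times over, even though each individual re-use cost almost nothing.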
“If you buy a Tesla today, I believe you’re buying an appreciating asset, not a depreciating asset.” – Elon Musk
Tesla cars appreciate in value as a result of the collective knowledge / wisdom / intelligence gleaned from the operational and driving data captured across the usage of the growing fleet of Tesla autonomous cars. What is experienced and learned by one Tesla car is validated, codified and propagated back to every other Tesla car, making the collective fleet of Tesla cars more intelligent – and therefore more valuable.
My biggest personal accomplishment for 2019? No, not United 1K (ugh). I wrote my 3rd book “The Art of Thinking Like a Data Scientist”, which is a workbook that I use in my Big Data MBA classes. This blog (and supporting infographic) details what you can expect from the workbook.
I hope you enjoy the workbook. It’s one of my steps on the path in teaching business stakeholders to “Think Like a Data Scientist,” a culmination of lessons-learned working with great clients over many years.
In fact, maybe I can create an equation that pretty much summarizes my 2019:
2019 Schmarzo = Economics + Design + AI – Airplanes + Customers/Students + Value Engineering
There, I guess that’s the official end of 2019. Looking forward to flying cars, virtual reality travel, blockchain-supported teleporting, real-time drone delivery and avoiding the Terminators lined up at Starbucks for their Venti Chai Tea fixes in 2020!
By Bill Schmarzo
CTO, IoT and Analytics at Hitachi Vantara (aka “Dean of Big Data”)
Bill Schmarzo is the author of "Big Data: Understanding How Data Powers Big Business" and "Big Data MBA: Driving Business Strategies with Data Science". He's written white papers, is an avid blogger and is a frequent speaker on the use of Big Data and data science to power an organization's key business initiatives. He is a University of San Francisco School of Management (SOM) Executive Fellow, where he teaches the "Big Data MBA" course. Bill also just completed a research paper on "Determining The Economic Value of Data". Onalytica recently ranked Bill as the #4 Big Data Influencer worldwide.