Tuesday, March 31, 2015

Tech Support for the Internet of Things


The Internet of Things promises seamlessly connected technologies that provide convenience, control, and simplicity. Unfortunately, the components of this ecosystem aren't yet as interconnected as consumers would like. The integration is cumbersome and disjointed, leaving numerous support gaps. While technological innovation is moving at a rapid pace, a service and support transformation is needed to unify the Internet of Things in the eyes of users. To deliver value in this new era, companies must re-examine the scope of their services and recast their offerings in light of the "Internet of Services." This is the only way that consumers will truly experience the benefits that the Internet of Things can deliver.


Guest Blog Post: Flipping the Lens on Sustainability Disclosure Frameworks

Written by: Nick Martin and Michael Rieger, Antea Group (MassTLC Member)

It seems that every day there is an article about the ‘Internet of Things’ or ‘Big Data’, and while we have yet to fully grasp the magnitude of this shift in technologies, it is a clear indication of where we are headed in terms of access to and sharing of information.  Transparency and disclosure expectations in this new operating environment are bound to increase given the state of technology, social media, and access to information.

The last decade has demonstrated the importance of transparency within the business world and we believe most companies, either through voluntary or mandated response, have embraced this as a fundamental requirement of success. The renewed calls for transparency have been amplified by simultaneous technological advancements and opportunistic stakeholders ranging from investor representatives (e.g., CDP, Ceres) to major corporations such as Walmart. The result has been an explosion of disclosure requests and expectations being placed upon companies in the United States and abroad. Many companies are feeling overwhelmed by survey requests, requiring a significant and seemingly endless allocation of resources.

Companies are increasingly being pushed into a precarious position of either declining sustainability disclosures outright or picking and choosing which surveys to respond to. Branded companies with a complex and diverse stakeholder base have increasingly begun to question the ongoing business value of disclosures compared with redirecting resources to other company investments and partnerships.

While the question of ‘do’ or ‘do not’ disclose will remain, in this blog we offer a different perspective on the business value of utilizing disclosure frameworks and requests. If you ‘flip the lens’, you can find an invaluable resource that we believe is underappreciated.    

Disclosure frameworks have done significant research and ‘leg work’ to define the most critical and effective elements of corporate strategies. They have engaged leading experts and companies in a rigorous development and maintenance effort to accelerate corporate adoption of sustainability-related initiatives. The business value is that the surveys and their associated scoring methodologies can be used by companies as comprehensive gap analysis tools for developing an effective, holistic strategy and/or validating efforts to date. Furthermore, these methods are free and include a wealth of guidance and resources (e.g., reports, webinars, diagnostic tools).
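To make the gap-analysis idea concrete, here is a minimal sketch: score a company's answers against a weighted questionnaire and surface the unanswered items, heaviest first. The question IDs and weights are purely illustrative, not any framework's actual scoring methodology.

```python
# Hypothetical disclosure questionnaire: question ID -> scoring weight.
# IDs and weights are illustrative only, not CDP's (or any framework's) scoring.
QUESTIONNAIRE = {
    "governance_board_oversight": 3,
    "emissions_scope1_reported":  3,
    "water_risk_assessment":      2,
    "supplier_engagement":        1,
}

def gap_analysis(answers):
    """Return (question, weight) pairs the company cannot yet answer,
    sorted so the highest-weighted gaps come first."""
    gaps = [(q, w) for q, w in QUESTIONNAIRE.items() if not answers.get(q)]
    return sorted(gaps, key=lambda g: g[1], reverse=True)

answers = {"emissions_scope1_reported": True, "supplier_engagement": True}
for question, weight in gap_analysis(answers):
    print(f"gap (weight {weight}): {question}")
```

The same loop works for any survey that publishes its questions and weights; the point is that the framework has already done the prioritization for you.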

Making This Real
The following are three examples of disclosure frameworks that are primed for extracting meaningful insights and utilizing these as a gap analysis process to enhance company standards:

• CDP (Water, Climate Change, and Forests) – an exceptionally transparent organization with surveys, methodologies, and scoring publicly available: https://www.cdp.net/en-US/Pages/guidance.aspx
  ◦ Questions to ask: Is our strategy well aligned with the associated CDP questionnaire? Can we answer questions affirmatively? What are the gaps, especially for elements to which CDP applies a higher scoring weight?

• Global Reporting Initiative (GRI) – the G4 Standard outlines a process for completing corporate annual reports, including clear guidance on individual indicator definitions and a quality rating methodology. In addition, GRI provides a search tool for real examples of corporate reports at varying quality levels, which offers valuable benchmarks to draw on in developing your company’s annual report: https://g4.globalreporting.org/Pages/default.aspx
  ◦ Questions to ask: Which of the GRI indicators is our company able and comfortable reporting on? If not able or comfortable, why not (e.g., insufficient data, not material, not a compelling story or performance level, not previously considered)? How do we compare to sector or peer leaders? Are there elements of the GRI guidance that would help our other disclosures (e.g., website material, supplier or customer communications, materiality assessment)?

• Sustainability Accounting Standards Board (SASB) – a unique initiative focused on defining the materiality of social and environmental issues for up to 80 unique sectors. The ultimate objective is to embed the outputs of the SASB Standards into financial disclosures (e.g., 10-K, SEC filings). The website provides: 1) Industry Briefs; 2) industry-specific Standards; and 3) a Materiality Map: http://www.sasb.org/approach/our-process/industry-briefs/technology-communication-sector-industry-briefs/
  ◦ Questions to ask: Has our company acknowledged the same materiality issues as proposed by SASB for our sector? If not, what are the differences and why? How prepared are we to align with the SASB Standards if they are integrated into financial filings? Is there additional data we should consider collecting now to be prepared?

In addition to these examples, there are many others that could be used in a similar manner, such as ISO 14001; ISO 50001; USGBC Leadership in Energy & Environmental Design (LEED); the UN Global Compact and CEO Water Mandate; and the Ceres Aqua Gauge and Ceres Roadmap for Sustainability. There are also a variety of industry- and material sourcing-specific guidelines.

While disclosure frameworks and standards will likely continue to proliferate in at least the short term, companies can ‘flip the lens’ and extract added value from these well-thought-out resources. This approach will help companies define cutting-edge strategies, drive business value, and prepare the organization internally, regardless of ultimate disclosure decisions.

Nick Martin is Sustainability Practice Lead (nick.martin@anteagroup.com) and Michael Rieger (michael.rieger@anteagroup.com) is a Consultant in the Boston area with Antea Group.

Tuesday, March 17, 2015

Guest Blog Post: How to Sidestep Automation, Augment Technology, and Keep your Job

Written by Scott Etkin of Data Informed, click here for original post.

BOSTON—Concerns about the impacts of technology on jobs are nothing new. Fears about automation replacing assembly line workers, for example, have been around almost as long as technology itself, and to a large extent, those fears have been realized. The advent of big data raises concerns that technology might be threatening the jobs of knowledge workers. But can automation have the same impact on a workforce that trades in knowledge and creativity rather than physical skill?

Tom Davenport, professor at Babson College, Research Fellow at the MIT Center for Digital Business, and a Senior Advisor at Deloitte Analytics, addressed this question today in a keynote address at an event hosted by Massachusetts Technology Leadership Council titled, “Big Data and the Knowledge Worker: Impacts on Workforce and the Economy.”

“(Knowledge workers) have had a pretty good run over the past few decades,” said Davenport. “It’s been pretty tough for factory workers, before that farm workers. It’s been pretty tough for service and transactional workers. Now that same automation is coming close to home. We always thought that whenever technology took over a type of job, that humans just moved to higher ground. When farming started to go away, people moved into factories. When factories started to go away, people moved into cities and did service and knowledge-oriented work. But this time, there is no higher ground.”
Davenport said automation has progressed from manual labor to administrative and service jobs, and that knowledge worker jobs might be the next step in that progression.

“One could argue that just as we automated manual labor jobs in the 18th and 19th century, administrative and service jobs in the 20th century, that the 21st century is where knowledge worker jobs really start to take it on the chin,” he said. “I think there is some sense of historical inevitability about this that we have to address seriously.”

Davenport identified several technologies that are driving knowledge work automation, including analytics and big data, machine learning, artificial intelligence/deep learning, and cognitive computing. He said that as analytics has evolved, it has become more recommendation oriented.

“Now I think it’s important to add a set of automated analytics at the top that says we are not just going to help you figure out the answer, we are going to take action on it,” he said. “We are going to make a decision and we are going to forge ahead with the action related to that decision. There are all sorts of spheres in which that is already taking place.”

Davenport identified 10 knowledge work jobs that he called automatable: lawyers, accountants, radiologists, reporters, marketers, financial advisers, architects, teachers, financial asset managers, and pharmaceutical scientists.

Augmentation Instead of Automation

As another possible result of the ongoing evolution of technology, Davenport offered augmentation – humans and computers working together to make better decisions – as an alternative to automation, in which technology simply takes over the jobs of humans.
“Augmentation means humans are helping computers make better decisions, and vice versa,” he said. “People do this by aiding automated systems that are better at a particular task or by focusing on tasks at which humans are still better. It’s an ever-changing domain.”
Currently, this cooperation of humans and machines can produce results better than either computers or humans alone. Davenport offered the classic example of freestyle chess and a 2005 freestyle chess tournament in which two amateur players using three laptops defeated both grand masters and supercomputers.

Five Possibilities for Augmentation

Davenport offered five steps to augmentation in jobs:
  • Step in. Learn the system, how it works, its strengths and weaknesses, and how and when to modify it.
  • Step up. Monitor the big-picture results of computer-driven decisions, and decide whether to automate new decision domains.
  • Step aside. Focus on areas that people do better than computers, such as the creative and interpersonal.
  • Step narrow. Focus on areas that are too narrow to be worth automating.
  • Build the steps. Create the automated systems.

Davenport also offered advice for knowledge workers who are concerned about being displaced by automation and how to become an augmenter.
  • Understand the ins and outs of how computers do your tasks, and try to improve them.
  • Specialize in a component of your job that can’t be done well by a computer, such as sales.
  • Write computer programs and algorithms yourself.
  • Find a narrow job niche that no one would bother to automate.

Davenport ended by implying that understanding what it takes to become an augmenter – learning, changing what you do, and a lot of work to make it happen – may determine whether you keep your job.
“There’s always the artisanal plumber route,” he said. “All those knowledge workers could get some really artistic plumber’s helpers and go to work.”

Scott Etkin is the managing editor of Data Informed. Email him at Scott.Etkin@wispubs.com. Follow him on Twitter: @Scott_WIS.

Big Data and the Knowledge Worker Summit - Feb. 27th Recap

Written by: Udi Dotan

Some of the best and brightest in the Massachusetts data and technology space gathered on the last Friday in February to discuss how the world of big data and computers might impact knowledge workers in the 21st century, with a keynote by esteemed author Tom Davenport.

Tom spoke on the displacement of workers over time.  In the 18th and 19th centuries, it was farm workers; in the 20th century, it was service jobs; today, it's knowledge workers being displaced by technology.  Artificial intelligence is growing, taking over roles that humans used to do, and it will take over more.  Computers can do many tasks faster and more efficiently than humans, and computers are cheaper, easier to manage, and don't complain about the cost of healthcare.

Should we welcome these changes as another in a string of technological revolutions that have enabled humans to flourish, or fear our new computer overlords?  Some technology leaders, such as Elon Musk and Bill Gates, have voiced concerns about these changes, positing that AI is the most dangerous development in history and should be looked upon with skepticism.

Where do these fears come from and how did we get here?
Since the dawn of the internet age more than 20 years ago (yes, it's been more than two decades since you saw your first AOL CD), information has been generated and has flowed more freely.  With more data, there has been more desire to analyze it, which has led to enormous growth in analytics.  Early analytics were descriptive, utilizing simple graphs and charts to understand our world.

Today, companies are utilizing predictive and prescriptive analytics to help make better decisions (think Amazon's recommender engine).  Going forward, more and more companies are leveraging larger stores of data, more compute power, and sophisticated algorithms to automate analytics.  One such example was given by Ed Macri, Senior Vice President of Marketing and Analytics at Wayfair.  They are using analytics to personalize one million emails a day based on recipients' prior visits, versus a single email carbon-copied to one million people.
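Stripped to its core, this kind of personalization is a ranking problem: score catalog items against each recipient's browsing history and fill an email template with the top matches. Here is a toy sketch of that idea (the catalog, categories, and template are invented for illustration, not Wayfair's actual system):

```python
from collections import Counter

# Hypothetical catalog mapping products to categories (illustrative only).
CATALOG = {
    "mid-century sofa":    "sofas",
    "walnut coffee table": "tables",
    "linen armchair":      "chairs",
    "oak dining table":    "tables",
}

def personalize_email(name, visited_categories, k=2):
    """Rank products by how often the recipient browsed their category,
    then fill a simple email template with the top k matches."""
    weights = Counter(visited_categories)
    ranked = sorted(CATALOG, key=lambda p: weights[CATALOG[p]], reverse=True)
    return f"Hi {name}, you might like: " + ", ".join(ranked[:k])

print(personalize_email("Alice", ["tables", "tables", "sofas"]))
```

A production system would of course learn the weights from behavior across millions of users, but the shape of the computation, per-user scores driving per-user content, is the same.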

Another example, given by keynote panel member Bruce Weed, Program Director, Global Watson and Big Data Ecosystem Development at IBM, is how Watson, through the use of its massive library, is helping medical doctors diagnose much faster and more efficiently.

What jobs are computers doing that have or will displace humans and how do we service our new masters?
Of course, no human is capable of personalizing a million emails.  But these aren't the only roles that are ripe for computerization.  According to Tom Davenport, here are some "at risk" jobs that computers can and will do better than humans:
  • Lawyers: e-discovery - combing through thousands of documents to find the nuggets of truth for specific court cases.
  • Accountants: audits, taxes - using intelligence to improve tax preparation (think TurboTax).
  • Radiology: cancer detection - using machines to read radiology reports and highlight areas of concern.
  • Reporters: automated story generation - computers can use data to generate articles for publishing (like this one? - not yet).
  • Marketing: online ad buying and personalized emails, as with Wayfair.
  • Financial Advisor: "robo-advisors" - generating customized portfolios for clients based on factors such as age, income, and tolerance to risk.
  • Teachers: online content and automated student evaluation - companies such as Khan Academy, Coursera, and EdX are delivering content online.  Next generation companies such as Dreambox and Knewton are delivering adaptive learning that modifies the material in response to student performance.
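The "robo-advisor" item above is a good illustration of why these jobs automate well: much of the logic is rule-based. A deliberately simplified sketch (no real firm's model; the classic "100 minus age" heuristic, nudged by risk tolerance) looks like this:

```python
def allocate(age, risk_tolerance):
    """Toy robo-advisor: stock weight starts at '100 minus age', then shifts
    by risk tolerance ('low', 'medium', 'high'). Returns (stocks %, bonds %)."""
    base = 100 - age
    adjust = {"low": -10, "medium": 0, "high": 10}[risk_tolerance]
    stocks = max(0, min(100, base + adjust))  # clamp to a valid percentage
    return stocks, 100 - stocks

print(allocate(40, "high"))  # a 40-year-old comfortable with risk
print(allocate(70, "low"))   # a retiree who wants stability
```

Real robo-advisors layer on tax-loss harvesting, rebalancing, and income factors, but the decision core is exactly this kind of deterministic mapping, which is why a computer can run it for millions of clients at near-zero marginal cost.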

Have the machines already won or is there a role for us yet?
Some companies are already leveraging intelligent machines; has this turned their offices into wastelands where tumbleweeds roll through giant data centers?  In short, no: there are still plenty of things that computers can't do without us.  Davenport refers to this work as augmentation.  Computers are good at computationally complex and repetitive tasks, but they can't see the bigger picture.  Humans will be needed to identify the strengths and weaknesses of the analytics systems and algorithms.  Humans will be needed to determine the business problems to solve.  And humans will build and maintain the systems to solve those problems.

As highlighted at the conference, big data technology enables much of the analytics innovation, as companies can manage larger and more varied data stores.  Jeff Kelly, Principal Research Contributor at Wikibon, believes that we are moving from early-stage big data implementations built around cost savings to a second generation in which companies with big data strategies are focused on revenue generation and operational efficiency.  P. Gary Gregory, SVP & GM, Database Servers and Tools at Rocket Software, illustrated that to be successful with such data initiatives, you need to start with a business problem and build data systems to support solutions.  Those systems don't need to include Hadoop, but the purpose of the data and the definition of the data sources should be clear; otherwise you end up with a data landfill, not a data lake.

Several of the panelists illustrated that the analytics revolution has led to a greater need for humans, not a lesser one.  Ivan Matviak, Executive Vice President and Head of Data and Analytics Solutions at State Street Global Exchange, says they are hiring more people, not fewer, to help build and maintain their advanced analytics capabilities.  In particular, they are aggressively seeking to hire the sexiest of workers: data scientists.  EMC's data science practice spends a great deal of effort investigating and rebuilding messy data for analytic purposes.  And Wayfair is augmenting automated ad purchases with targeted human buys of online ad space.

Iran Hutchinson, Product Manager & Big Data Software/Systems Architect at InterSystems led a lively panel discussion illuminating success stories at companies leveraging big data and human augmentation to gain remarkable insights.  Joe Dery, a Senior Data Scientist at EMC relayed how EMC increased revenues by mining internal contract data to optimize contract renewals.  The key to optimization was not in the volume or veracity of the data (although there certainly were large volumes of data), but rather in clarifying data definitions and educating the sales team.  According to Joe, the model generation was the simplest part of the two year project.

Gary Sloper, VP of Sales Engineering and Operations at CenturyLink, uses big data to proactively monitor network activity, applying machine learning algorithms that can detect anomalies.  Such techniques can be employed to prevent hacks like those Sony and Anthem have recently experienced.

At Care.com, Co-founder and CTO Dave Krupinski and his team have focused analytic attention on optimizing the match rate between jobs posted and caregivers seeking jobs.  This has given them guidance on the optimal flow of applications into a job posting, the optimal number of applications per job, and the key terms that make a caregiver more likely to be hired.  The insights have increased the match rate from 70% to over 80%, with more improvements in the pipeline.
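The metric itself is simple; what the team optimized is everything that moves it. As a hypothetical sketch (field names invented for illustration), the match rate is just the fraction of postings that ended in a hire:

```python
def match_rate(postings):
    """Fraction of job postings that resulted in a hire."""
    filled = sum(1 for p in postings if p["hired"])
    return filled / len(postings)

jobs = [{"hired": True}, {"hired": True}, {"hired": False}, {"hired": True}]
print(match_rate(jobs))
```

Segmenting this number by application count, posting wording, and similar factors is what surfaces the levers described above.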


As the volume, velocity, variety, and veracity of big data grow and analytics become more complex, the opportunity for a cooperative relationship between machines and humans will continue to grow, and we will continue to find ways to employ technology to advance society.