The Unintended Consequences of New Technologies

Four recommendations for a small company to mitigate risks.

Automating business processes and decision making helps businesses become more efficient and more responsive while reducing manual operations and costs.

New technologies and services that use artificial intelligence (AI), machine learning (ML), predictive analytics and even rule-based automation engines are easy for companies to incorporate into their existing processes, making it simple to jump-start an automation journey.

AI and ML also promise to improve a business's customer interactions by providing real-time, customized responses and services. Companies also benefit from the detection of risks, like fraud, which can then be resolved before they cause harm.

While these technologies can fulfill all of these promises, they also carry risks for both companies and their customers. Understanding these risks, then consciously deciding whether they are worth the reward, goes a long way toward mitigating them.

There are three types of inherent risk in automating decision making using the latest technologies such as artificial intelligence, machine learning, predictive analytics and rules-based automation:

  1. Unintended Biases
  2. Unintended Outcomes
  3. ‘Creepiness’

1. Unintended Biases

Automating decision making requires data as an input. In the case of AI, ML and predictive analytics, it may require large amounts of data, both past and present, to feed the software engine so it can identify and "learn" patterns, make predictions and then learn from those decisions to make better ones next time.

This data can come from internal company sources and external public data. It can be structured (e.g., transactions) or unstructured (e.g., photos, website activity).

Unfortunately, acquiring enough 'good' data to feed and test these systems is difficult. Often historical data does not exist, or it exists but is incomplete. Other times, the data is of poor quality, its originating sources are unknown, or it has built-in bias.

For example, in a 2018 study of facial recognition, the tools misidentified dark-skinned women far more often than light-skinned men.

This was caused by insufficient input data for dark-skinned women, leaving the 'engines' unable to learn the patterns with the right level of accuracy. Imagine if these tools had been used in law enforcement or apartment security applications.
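
To see how this kind of imbalance plays out, here is a minimal Python sketch using entirely synthetic data (the feature values, group sizes and model choice are invented for illustration): a classifier trained mostly on one group tends to be noticeably less accurate on the underrepresented group.

```python
# Minimal sketch: a model trained on data that underrepresents one group
# can end up far less accurate for that group. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic features and labels for one demographic group."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 5))
    y = (X.sum(axis=1) + rng.normal(scale=2.0, size=n) > shift * 5).astype(int)
    return X, y

# Group A dominates the training set; group B is barely represented
# and follows a somewhat different pattern.
Xa, ya = make_group(5000, shift=0.0)
Xb, yb = make_group(200, shift=1.5)

model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb])
)

# Evaluate on fresh samples from each group: accuracy is typically
# much lower for the underrepresented group B.
Xa_test, ya_test = make_group(1000, shift=0.0)
Xb_test, yb_test = make_group(1000, shift=1.5)
print("Group A accuracy:", accuracy_score(ya_test, model.predict(Xa_test)))
print("Group B accuracy:", accuracy_score(yb_test, model.predict(Xb_test)))
```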

In addition to bias in the input data, the algorithms themselves can carry biases introduced unintentionally by the business analysts or data scientists who decide which criteria or variables to include or exclude.

As an example, an insurance company's credit risk algorithm used only two criteria, high mileage and an address in a particular state, to establish creditworthiness, resulting in many customers in poor neighborhoods being deemed poor credit risks.

However, when the algorithm was enhanced with more criteria and more data, many of those initially flagged with a poor risk rating were now approved.
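
To make this concrete, here is a minimal Python sketch with hypothetical rules, a made-up 'high-risk' state code and an invented applicant (the insurer's actual criteria are not public): adding more criteria can flip a decision that the narrow rule set would have gotten wrong.

```python
# Minimal sketch of the credit-risk example above. The rules, thresholds
# and applicant data are all hypothetical, purely for illustration.

def narrow_risk_rating(applicant):
    """Overly narrow rule set: only mileage and state are considered."""
    score = 0
    if applicant["annual_mileage"] > 15000:
        score += 1
    if applicant["state"] == "XX":        # hypothetical 'high-risk' state
        score += 1
    return "poor risk" if score >= 2 else "acceptable"

def broader_risk_rating(applicant):
    """Enhanced rule set: more criteria dilute any single proxy variable."""
    score = 0
    if applicant["annual_mileage"] > 15000:
        score += 1
    if applicant["state"] == "XX":
        score += 1
    if applicant["years_claim_free"] >= 5:
        score -= 1
    if applicant["on_time_payment_rate"] >= 0.95:
        score -= 1
    return "poor risk" if score >= 2 else "acceptable"

applicant = {
    "annual_mileage": 18000,
    "state": "XX",
    "years_claim_free": 7,
    "on_time_payment_rate": 0.99,
}

print(narrow_risk_rating(applicant))   # -> poor risk
print(broader_risk_rating(applicant))  # -> acceptable
```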

2. Unintended Outcomes

Unintended outcomes happen when automation produces additional or different results than those planned.

For example, a fitness app that maps users' exercise routes published routes in Middle East war-zone regions that were found to compromise western military intelligence. Oops!

3. “Creepiness”

'Creepiness' happens when the result of the automation is exactly what was planned, but it produces a negative reaction from your customers or employees.

For example, would your customers feel your 'tracking software' was creepy if it followed them around the grocery store and offered coupons as they passed certain product aisles?

What if your own car navigation system detected that you had an accident and automatically called your insurance company for you so that an agent could be available on site?

In all of these cases, the risks are usually not legal ones, such as a violation of privacy laws, but rather risks to the company's reputation and brand, and to the 'trust' the brand has built with its customers.

Next: Four recommendations for a small company to mitigate risks
