
Straight Talk Technology

Ovum view

Once, our face was our own. Several recent stories show that this is no longer the case, and regulation lags so far behind the technology that almost anything goes.

CCTV has been with us for decades, and it has now become so ubiquitous in some localities, with cameras dotting power poles and building facades, that the traditional doorway signs announcing CCTV surveillance are now largely meaningless. When identifying a face needed a human operator, the cost of scanning footage to find a person was prohibitive for most purposes beyond criminal investigation. That has all changed with advances in facial recognition software. Big Brother, in many locations, can track your daily movements by simply providing an image of you to the software.
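The economic shift described above comes down to how modern recognition systems work: faces are reduced to fixed-length embedding vectors, so finding one person in thousands of hours of footage becomes a cheap nearest-neighbor search rather than a human review job. The sketch below is illustrative only; the embeddings are synthetic random vectors standing in for the output of a real face-encoding model, and the gallery size and indices are invented for the example.

```python
# Minimal sketch of embedding-based face matching. In real systems a deep
# network converts each detected face into a fixed-length vector; here we
# use synthetic random vectors so the example is self-contained.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "gallery": embeddings extracted from CCTV frames.
gallery = rng.normal(size=(10_000, 128))

# The probe: a single supplied image of the person being searched for,
# simulated as a slightly noisy re-capture of one gallery face.
probe = gallery[4242] + rng.normal(scale=0.01, size=128)

# Matching is one vectorized distance computation over the whole gallery,
# replacing hours of human scanning of footage.
distances = np.linalg.norm(gallery - probe, axis=1)
best_match = int(np.argmin(distances))
print(best_match)  # index of the closest gallery face
```

Because the search is a single array operation, it scales to millions of frames on commodity hardware, which is precisely why the old cost barrier to mass identification has disappeared.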

Social media operators such as Facebook can automatically recognize and tag faces in images, add them, without explicit permission, to any location where the operator's software "thinks" they may be of interest, and sell the resulting intelligence wherever they please. This ability is governed mostly by commercial and reputational concerns rather than by government regulation or human rights considerations.

Retailers can harness existing CCTV to spot prime customers or known shoplifters and treat them with due attention, or to map customer movement around the store and precinct. Extending face recognition in retail beyond these kinds of applications risks "creepy" retailer behavior and raises ethical concerns similar to those around social media image exploitation.

Movie directors regularly replace stand-ins' and stunt performers' faces with those of actors. It is also possible to seamlessly put words into another person's mouth by combining the voice track from one video with the image from another. The result, called a deepfake when used for nefarious purposes, is a video of one person's face speaking another's words. As the technology improves, deepfakes are becoming increasingly difficult to detect.

This Big Brother is anyone who has access to CCTV footage, video, or images: governments and law enforcement, social media companies, and store owners. The information can be, and is, traded and reused relatively freely in a market worth $76bn in the US alone, according to the Democratic think tank Future Majority.

GDPR has made a good start with the proposition that all my data belongs to me, and that others can keep or use it only with my explicit permission. Regulation needs to follow a similar path with my voice and my image.

Straight Talk is a weekly briefing from the desk of the Chief Research Officer. To receive this newsletter by email, please contact us.

