Facebook has announced that it is halting development on its Instagram Kids project. This follows reports that the social media giant had commissioned – and kept secret – internal research that found Instagram was detrimental to young people’s mental health. The study’s findings, not to mention the fact that they were withheld, have only bolstered the heavy criticism the project initially came in for. “Instagram for kids,” ran one headline early on, “the social media site no one asked for”.

Quite who has asked for what, in information technology development, is an interesting question. In the late 1980s, research had already highlighted that the history of computers was arguably one of creating demand more than responding to need. And social media is no different: it has gone from being the thing we didn’t know we wanted to being embedded in all that we do. Research increasingly confirms it can be a source of harm too.

Children are at the heart of this battle between usefulness and safety. They’re the future designers of our tech – they will inherit our messes – but they’re also using it right now. And they’re tech companies’ future customers. Head of Instagram Adam Mosseri has been quick to defend the value and importance of a kids’ version of the app. But can we trust big tech to give us what we actually need, as opposed to manipulating us into consuming what they need to sell?

From user experience to user safety

[Image: The 1980s enthusiasm for early personal computers belies the fact that they weren’t that useful – or usable. ClassicStock / Alamy Stock Photo]

Technology, of course, now enables how we live, how we communicate, how we interact, how we work. Households are filled with devices and applications which are usable, useful and being used. Figuring out how to provide a meaningful and relevant experience for someone using a digital product or service, from devices to social media platforms, is what is known as user experience design. Indeed, keeping devices and all they contain in use is central to IT design: the user is a customer, and the tech is designed to nurture – solicit, even – that custom.

Tech giants talk about meeting our expectations even before we know them ourselves. And the way designers know what we want before we want it comes down to the data they collect on us – and our children.

A flurry of recent lawsuits, however, highlights the line, in terms of harm to the user, that such digital innovation driven by profit and shaped by our personal data has crossed. These include the case launched by the former children’s commissioner for England, Anne Longfield, against TikTok. Longfield’s case alleges that the video-sharing platform harvests the personal information of its under-age users for targeted advertising purposes: from date of birth, email and phone number to location data, religious or political beliefs and browsing history.

The concern these days is that privacy is under threat because profits take precedence over safety. The usability movement which started in the late 1980s therefore now needs to make way for what computer scientists term usable security: human-centric design, where safety takes precedence.

Our research shows that many online applications are not fit for use. They fail to find the balance between usability and security (and privacy). We need to further explore the potential of open-source designs – those not driven by profit – as alternatives. And we need to foster ethical awareness around technology in young minds: they are tomorrow’s programmers. As important as learning to code is understanding the ethical implications of what is being coded.

Instagram head Adam Mosseri announced Monday that the company is “pausing the work” on Instagram Kids, an ad-free version of the photo-sharing app intended for children under the age of 13 that has prompted a barrage of criticism since plans to build such a product were first reported back in March. “Critics will see this as an acknowledgement that the project is a bad idea. That’s not the case,” he claimed, noting “we’re not the only company” to see the need for such an experience. “We’ll use this time to work with parents, experts and policymakers to demonstrate the value and need for this product” and “continue to build opt-in parental supervision tools for teens,” Mosseri wrote in an Instagram blog post, in which he seemed to at once acknowledge the need to put the plan on ice and dismiss reasonable conclusions one could draw from the company doing so.