Recent research by Alan Mislove, who studies algorithms at Northeastern University in Boston, has shown that Facebook’s software was offering users’ phone numbers to advertisers. He has also found new ways to audit that same software for racial bias, according to The Economist. However, work like his involves scraping data from public websites, which often violates their terms and conditions. Moreover, the companies that own the websites are generally unwilling to give researchers more direct access to their systems.
In addition, examining other people’s algorithms means writing algorithms of your own. Mislove’s group often spends months just writing the code needed to gather data for a single inquiry. As a result, only those with strong computer-science skills can study the programs that play such an important role in today’s society.
Facebook is currently embroiled in a scandal over its handling of data and the power of its hyper-targeted advertising software. Meanwhile, Mislove is working with a group of researchers at the Massachusetts Institute of Technology (MIT) who think they have an answer to these problems.
The group is led by Iyad Rahwan and has taken a leaf out of the book of B.F. Skinner, an animal behaviourist. Skinner invented a device, now known as a Skinner box, which standardised the process of behavioural experimentation. He used his boxes to control input stimuli (food, light, sound, pain) and then observed output behaviour in an attempt to link the one to the other. The Skinner box was a big advance in the field. Dr Rahwan hopes to do something similar for software, using what he calls a Turing box, says The Economist.
This “box” is itself a piece of software. Place an algorithm inside it, control the data inputs and measure the outcomes, and you can work out exactly how the algorithm behaves in different circumstances. Anyone who wants to study an algorithm could upload it to a Turing box.
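The principle can be sketched in a few lines of code. The example below is a hypothetical illustration, not Dr Rahwan's actual software: the algorithm under test is treated as a black box, fed two controlled groups of inputs that differ in a single attribute, and the difference in its average outputs is measured — the same input-control, output-observation logic as a Skinner box. The `toy_ad_pricer` function and the `postcode` attribute are invented for the sake of the example.

```python
# Hypothetical sketch of a "Turing box"-style audit harness.
# The algorithm under test is a black box: we only control its
# inputs and observe its outputs, then compare the two groups.

def audit(black_box, inputs_a, inputs_b):
    """Run one black-box algorithm on two controlled input groups
    and return the gap between their average outcomes."""
    scores_a = [black_box(x) for x in inputs_a]
    scores_b = [black_box(x) for x in inputs_b]
    return sum(scores_a) / len(scores_a) - sum(scores_b) / len(scores_b)

# Toy algorithm under test: an ad-pricing rule that (unfairly)
# charges users in one postcode group more than another.
def toy_ad_pricer(user):
    base_price = 10.0
    return base_price + (5.0 if user["postcode"] == "A" else 0.0)

# Two groups of synthetic users, identical except for one attribute.
group_a = [{"postcode": "A"} for _ in range(100)]
group_b = [{"postcode": "B"} for _ in range(100)]

gap = audit(toy_ad_pricer, group_a, group_b)
print(f"average price gap between groups: {gap:.2f}")  # prints 5.00
```

Because the harness never looks inside `toy_ad_pricer`, the same test would work on any uploaded algorithm — which is the point of standardising the box rather than the algorithms it examines.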
Dr Rahwan hopes that companies will want political and social scientists to use the box to analyse their algorithms for potentially harmful flaws, and that researchers will be willing to do the testing.