Friday, July 19, 2024

#1 Facebook in 2016 and Kawasaki Heavy Industries in 1981 have something else in common

This is not so much about cybercrime as about artificial intelligence in general. First, there is the story of how mortgage scores are calculated in the US. In Japan there was the fraud at Bookoff, and in essence it is the same problem: manipulating numbers. US lenders screen mortgages with a score called FICO, but since we don't know how that score is calculated, if the algorithm or the underlying data were fraudulently rewritten, nobody would notice.
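The opacity problem can be shown with a toy sketch. This is a made-up scoring function, not the real (proprietary) FICO formula; the point is only that a caller who sees just the output number cannot tell a tampered model from an honest one.

```python
# Toy illustration of an opaque credit-scoring model (hypothetical
# weights, NOT the real FICO formula). Callers only see a number,
# so a quietly tampered weight is invisible from the outside.

def score_original(income, debt, history_years):
    # Made-up weights for illustration only.
    return 300 + income // 500 - debt // 1000 + 10 * history_years

def score_tampered(income, debt, history_years):
    # Same interface, but the debt penalty has been silently removed.
    return 300 + income // 500 + 10 * history_years

applicant = dict(income=60_000, debt=200_000, history_years=5)
print(score_original(**applicant))  # 270
print(score_tampered(**applicant))  # 470
# A heavily indebted borrower suddenly "qualifies", and no external
# observer can tell which formula produced the number.
```

Both functions return a plausible-looking score; only access to the internals reveals the manipulation, which is the accountability gap the book describes.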

This is also a problem of democracy and asymmetries of power. If an algorithm is not accountable, it is more likely to be used for crime.

Then there is the story of the "Facebook experiment": around 2014, Facebook sampled some 700,000 users and split their news feeds, so that one group saw only sad news and the other only happy news. Their subsequent posts showed that the users who saw sad news wrote sadder posts, and the users who saw happy news wrote happier ones.
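The design of the experiment can be sketched as a toy simulation. Everything here is invented for illustration (the user counts, the "contagion" strength, the sentiment scale); it is not Facebook's actual methodology or data, only the shape of the A/B split described above.

```python
import random

# Toy sketch of a feed-manipulation experiment: two groups of users,
# one shown a uniformly sad feed (-1), one a uniformly happy feed (+1).
# All numbers are made up for illustration.
random.seed(0)

def run_group(feed_mood, n_users=1000, contagion=0.3):
    """Average sentiment of posts by users exposed to `feed_mood`.

    Each user starts with a random neutral mood in [-1, 1]; exposure
    to the feed pulls their next post's sentiment toward the feed's
    mood by the factor `contagion`.
    """
    sentiments = []
    for _ in range(n_users):
        baseline = random.uniform(-1, 1)
        post = (1 - contagion) * baseline + contagion * feed_mood
        sentiments.append(post)
    return sum(sentiments) / n_users

sad_avg = run_group(feed_mood=-1.0)
happy_avg = run_group(feed_mood=+1.0)
print(f"sad-feed group average sentiment:   {sad_avg:+.3f}")
print(f"happy-feed group average sentiment: {happy_avg:+.3f}")
# The happy-feed group posts measurably happier content on average.
```

Even this crude model makes the mechanism visible: whoever controls the feed controls the average emotional tone of what users produce next.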

As noted on page 450, emotional states transfer to others: people come to feel the same emotions without being aware of it, which means emotions can be controlled. Since democracies make political decisions based on how people feel rather than how they think, if this technique is widely applied it will affect democratic decisions, because human emotions can be steered.

The same can be said of election interference by Russia. In the end, democracy runs not on how we think but on how we feel, so the abuse of social networks and algorithms can destroy democracy.

The story of Kawasaki Heavy Industries in Japan in 1981.

In that 1981 incident, the machine seems to have misidentified the situation. I don't know whether it was called artificial intelligence at the time, but a robot misidentified an employee as an obstacle, and the employee died. As automation and mechanization progress, we come to rely totally on algorithms, and when misrecognition occurs, the whole system becomes fragile and the damage grows.

Recently there has been talk of adversarial data, which, if used for criminal purposes, leads to the same kinds of issues as the threats to democracy and Russian election interference above. In the future, IoT devices and industrial and factory networks could likewise be hit by ransomware if their control software is maliciously rewritten.
