In the Syrian capital Damascus, not far from the front lines, the child Alaa Eddin al-Fawakhiri met his death, but not by bombing or fighting. His mother Alexandra found him on March 2, 2018, hanging from a rope tied to the ceiling of his room when she returned from visiting one of her friends.
Later, she confirmed that her only son had not suffered from mental illness, though she had noticed strange behavior: a tendency toward introversion and keeping to himself, and sitting for hours in the dark with his mobile phone, which he kept with him everywhere.
Once she saw wounds on his lips, which he said he had bitten by accident.
There is no doubt that artificial intelligence has reached an advanced stage of development and has begun to govern daily life down to its finest details.
However, it has gone beyond its intended role of enhancing and easing life: traders and researchers have begun to exploit it for harmful ends, inventing new means of inflicting casualties, such as electronic games that can ultimately lead to suicide.
It did not occur to Alexandra that her son might be under the influence of a suicide app until the forensics authority in Damascus, after finding scars and wounds on the body, confirmed that it was a suicide and not a murder.
The head of the authority, Dr. Ayman Nasser, and the examiner for the case, Dr. Tajjeddin Shaker, told local media that this was not the only case: recently they had received the bodies of a number of adolescents who had hanged themselves without any pathological or pharmacological motive (mental illness or drug use).
It later became clear that some of those who committed suicide had fallen under the control of mobile apps called Blue Whale and Mariam, which use excitement and intimidation to push young people to hang themselves.
The authority’s investigations are not final and some cases are still underway, according to the forensic doctors, who are working to extract the contents of the victims’ phones to confirm the presence of the apps as one step of the investigation.
Rid the world of trash
The Blue Whale app, designed by the 21-year-old Russian Philip Bodkin, targets adolescents between 12 and 16 years old. Its creator claims that it offers a service to the world by cleansing it of the “biological trash” that would otherwise harm society, and says that those who commit suicide will be happy with their decision.
The young Russian began his project on a website encouraging children to commit suicide, before it was blocked. His method is based on isolating the child or adolescent from their social circle and winning their trust before beginning to encourage them to commit suicide.
The game is a series of harmful challenges, beginning at a basic level with tasks such as waking up at odd hours between late night and dawn, watching horror films, and drawing a picture of a whale on the player’s hand and sending a photo as proof.
It ends after 50 days, with the player psychologically destroyed, in the suicide challenge.
The app is under the direct control of administrators who communicate with each player personally, their messages laced with threats of hostile acts against a relative if the instructions are not carried out.
The Arab version… suicide and the mukhabarat (intelligence services)
Users began circulating the Mariam game on July 25, 2017, after Saudi programmer Suleiman al-Halabi released it through the Apple App Store, saying the game was designed to strengthen interaction between the app and the user.
The game’s events revolve around a small girl named Mariam who is lost far from her home, and the user has to help her find her way back.
During the journey, Mariam interacts as if she were a real person: she gets tired, asks to stop, and wants to continue the next day.
Mariam tries to get to know the person helping her and so asks for personal information and social media accounts.
The game’s designer says it is “just a game,” that the information is stored only on the user’s device, and that he cannot access it under any circumstances.
However, what terrifies users is the political questions this child asks. As the game began to spread, a number of countries, including the UAE, warned against providing personal information and urged people to stay away from it, especially after rumors spread that the information would be exploited for political aims.
Many users consider this game to be the Arab version of the Blue Whale game.
Enab Baladi searched for the game on Google, the Apple App Store and Facebook and did not find it.
According to information engineer Abdel Had al-Sayed, accessing the game is not easy, as it is “not available to the general public.” This is one source of its thrill and excitement: the game requires an invitation from someone who has previously owned it.
The engineer said that those in charge of social networking sites and global app stores do not investigate whether a new app or game encourages suicide. They check only whether it contains programming or technical violations that could slow the device, or whether it contains viruses, so such apps are banned only after users report them.
He also denied that the designers of suicide apps have prior access to users’ information; they can obtain it only by communicating with the players directly.
Mariam’s designer is currently in jail, after a girl who reached the final stage of the game went to the authorities, while a number of adolescents still sit in the seats of this theater of suicide.
This article was translated and edited by The Syrian Observer. Responsibility for the information and views set out in this article lies entirely with the author.