Not so long ago, the public knew who was shaping its opinions: newspaper editors, news anchors, and broadcasters. One could agree or disagree with them, but at least it was clear where influence was concentrated. Today, part of that power has shifted to an invisible system: algorithms. Human editors no longer decide what we see; mathematical formulas do, formulas designed to maximize emotion, screen time, and profit.

Social networks initially promised a revolution of freedom. Anyone could speak, publish, and influence. In practice, they have produced a system more centralized than any printed newspaper. A handful of technology companies now wield unprecedented power to shape consciousness. The algorithm does not ask what is important to know, but what will keep us engaged. And what keeps us engaged is often anger, fear, and polarization.

This has created an economy of emotion. Moderate content rarely survives. By contrast, sensational headlines, conspiracy theories, and extreme videos are rewarded with broad distribution. The algorithm is not ideological; it is indifferent to truth. It rewards engagement.


The result, however, is a public sphere in which truth struggles to hold its ground against sophisticated emotional manipulation.

The problem is not only lies. The problem is that each user receives a personalized reality. Two citizens in the same country may inhabit entirely different information worlds. This is no longer a political disagreement; it is the disintegration of a shared language. Democracy relies on agreed-upon basic facts. Without shared facts, there is no foundation for public discourse.

Moreover, the algorithm comes to know us better than we know ourselves. It knows when we are tired, when we are vulnerable, and what triggers us emotionally. This information is used for advertising and for political influence. Modern campaigns are not aimed at the public but at the individual. Each person receives a different message. This is personalized politics, and it is nearly impossible to oversee.

How algorithms amplify outrage

Here lies the paradox: Never has information been so available, and never has it been so difficult to know what is true. Information overload creates cognitive fatigue. The public is pushed toward simplification and empty slogans. The algorithm does not merely reflect polarization; it amplifies it. It learns that outrage performs better than discussion, and so it serves us more of it.

Today, every social movement is influenced by code written in Silicon Valley. Technology companies claim neutrality, but algorithmic neutrality is a myth. Every programming choice is also a value choice: what to promote, what to hide, and what to measure as success.

The solution is not government censorship but transparency. The public must know how the mechanisms that shape its consciousness operate, and smart regulation can enforce that accountability without suffocating freedom of expression.

In a cynical world where artificial intelligence is often used to spread lies and to manipulate video and audio, sowing chaos and despair, there is another use, one that seeks to build a better world across borders: AI for emotional support.

AI for emotional support

In the United States and Israel, a group of scientists from MIT, some of them Israeli, has developed a first-of-its-kind system designed to assist people suffering from post-traumatic stress and anxiety, including soldiers returning from combat zones and civilians exposed to extreme events.

The system, called MyWave, is available as a desktop app or a mobile app. It offers an advanced integration of artificial intelligence with real-time physiological metrics. At its core is a repository of calming music, including tracks uploaded by singers and artists, which users can access instantly whenever needed.

The system’s uniqueness is not limited to music. The app connects to a smartwatch or ring that collects physiological data, including stress levels, heart rate, sleep patterns, blood pressure, ECG readings, and oxygen saturation. These are combined with vocal analysis while the user speaks into the app’s microphone.

As users express their emotions, the AI gradually learns to recognize their patterns: triggers for anger, mood fluctuations, and emotional responses. Based on this data, it selects precise musical frequencies designed to calm and restore emotional balance.
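MyWave's internal workings have not been published. Purely as an illustration of the kind of pipeline described above, the sketch below shows how a system of this sort might fuse wearable metrics with a voice-derived stress signal into a single score and map it to a calming track. Every name, weight, and threshold here is a hypothetical placeholder, not MyWave's actual design.

# Hypothetical sketch only: fusing wearable metrics and a voice-stress
# signal into one score, then choosing a calming track category.
from dataclasses import dataclass

@dataclass
class Biometrics:
    heart_rate: float    # beats per minute
    spo2: float          # oxygen saturation, percent
    systolic_bp: float   # mmHg

def stress_score(bio: Biometrics, voice_stress: float) -> float:
    """Combine normalized signals into a 0-1 stress estimate.
    Weights are illustrative, not clinically derived."""
    hr = min(max((bio.heart_rate - 60) / 60, 0.0), 1.0)    # 60-120 bpm -> 0-1
    bp = min(max((bio.systolic_bp - 110) / 50, 0.0), 1.0)  # 110-160 mmHg -> 0-1
    o2 = min(max((98 - bio.spo2) / 8, 0.0), 1.0)           # 98-90 pct -> 0-1
    return 0.35 * hr + 0.2 * bp + 0.1 * o2 + 0.35 * voice_stress

def pick_track(score: float) -> str:
    """Map the stress estimate to a (hypothetical) music category."""
    if score > 0.7:
        return "slow ambient, ~60 BPM"
    if score > 0.4:
        return "soft acoustic, ~75 BPM"
    return "light instrumental, ~90 BPM"

if __name__ == "__main__":
    reading = Biometrics(heart_rate=102, spo2=96, systolic_bp=138)
    s = stress_score(reading, voice_stress=0.8)  # voice_stress from audio analysis
    print(f"stress={s:.2f} -> {pick_track(s)}")

In a real system, the weights and thresholds would presumably be learned per user over time rather than fixed, which is what the article means by the AI "gradually learning" a person's patterns.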

In the initial stages, use of the system is supervised by a psychologist, either in a group setting or individually, to ensure professional oversight. After an adjustment period, users can rely on it independently at home, even late at night, when anxiety arises and no professional is available. In such moments, a phone and headphones become an immediate, accessible support tool.

The system has already been tested at a university in Boca Raton, Florida, as well as in Israel among trauma and anxiety patients. Meetings with senior figures in Israel’s health care system, including psychologists and psychiatrists, have shown strong interest in the therapeutic potential and the novelty of an approach unlike any used before.

One particularly harrowing case involves a soldier from an elite unit in Jerusalem who experienced severe trauma: His partner was shot and killed before his eyes, and he himself was wounded. Despite professional psychological treatment, he was unable to sleep. The trauma was profound. After several sessions using the MyWave app and its music, his anxiety dropped significantly, allowing him to sleep.

Development of the system is expected to continue for several more months, after which it is slated to become available to hospitals and medical centers treating post-traumatic stress and anxiety disorders. MyWave may mark a breakthrough in trauma treatment.

The combination of advanced technology, music, and human emotion offers new hope to thousands of patients. According to published reports, more than 50,000 soldiers in the first circle of exposure alone were affected by the events of October 7.

The project is led by Dr. Yaron Seidman of Florida, together with Jay Buck, a Jewish philanthropist and supporter of Israel, Steve Benedetti, and senior figures from Wall Street. The initiative will be offered free of charge to hospitals, rehabilitation centers for wounded veterans, and government institutions treating anxiety and trauma patients, as a contribution to the State of Israel.

The author is the CEO of Radios 100FM, an honorary consul and vice dean of the consular diplomatic corps, president of the Israel Radio Communication Association, and a former Israel Defense Forces Radio correspondent and NBC television correspondent.