In today’s digital environment, online discourse has grown increasingly polarized, and healthy, productive conversation has declined. The algorithms that shape this environment are neither transparent to end users nor under their control.
At Preamble, we are building the infrastructure for applying ESG principles to AI. Our middleware lets AI systems incorporate values from a diverse set of values-providers (trusted organizations) to improve the content shown to users. An end user can subscribe to a values-provider (e.g., a civil rights nonprofit) so that the content presented to them is most suitable. This also offers a First Amendment-agnostic approach to content moderation.
A Preamble is a portable list of the values you care about, each linked to a values-provider and scored by how much it matters to you. It is a representation of your values that you can hand to an AI system to change the way it acts, giving you control and transparency for a safer online experience.
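To make the idea concrete, the description above can be sketched as a simple data structure. This is an illustrative assumption, not Preamble's actual schema: the class names, fields, and provider names below are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a Preamble as described: a portable list of
# values, each linked to a values-provider and carrying a user score.
# Names and fields are illustrative assumptions, not a real schema.

@dataclass
class ValueScore:
    value: str      # a value the user cares about, e.g. "anti-harassment"
    provider: str   # the values-provider the value is linked to
    score: float    # how much the user cares about it (0.0 to 1.0)

@dataclass
class Preamble:
    user_id: str
    values: list[ValueScore] = field(default_factory=list)

    def subscribe(self, provider: str, value: str, score: float) -> None:
        """Add a value from a values-provider to this Preamble."""
        self.values.append(ValueScore(value, provider, score))

    def weight_for(self, value: str) -> float:
        """Aggregate score an AI system could read to adjust its behavior."""
        return sum(v.score for v in self.values if v.value == value)

# Usage: a user subscribes to two (hypothetical) values-providers.
p = Preamble(user_id="alice")
p.subscribe(provider="civil-rights-nonprofit", value="anti-harassment", score=0.75)
p.subscribe(provider="media-literacy-org", value="anti-harassment", score=0.5)
print(p.weight_for("anti-harassment"))  # 1.25
```

Because the structure is plain data, it stays portable: the same Preamble could be handed to any AI system that knows how to read it.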