  • 👋 Hi, I’m @nickfain
  • 👀 I’m interested in metadata
  • 🌱 I’m currently learning Blockchain
  • 💞️ I’m looking to collaborate on good ideas
  • 📫 How to reach me: nickfain.eth

Preface

[Much of this writing is a stream of consciousness, so it may contain grammatical errors or conflicting information that has yet to be evaluated. I envision that detailed ideas like this can be consumed and improved by a late-stage generative a.i., so I won't obsess over certain aspects that might stunt the creation of this initial writing.]

Certain generative a.i. abilities will be exhausted easily once enough data and processing speed are combined. What will be missing is a guiding morality, and I don't think it will be as simple as injecting more raw data to compensate. Human reasoning is required to navigate moral situations.

At a high level, the goal is to train a moral a.i. from the experiences of humans. Many pieces are needed, and the expertise to create them requires a diverse skill set unlikely to be held by any one person. This would need to be an open, public endeavor that incentivizes participation at every layer.

As a first pass, I will attempt to list the features and functionality that are needed. This list is not static, nor is any entry's description intended to be complete.

  1. A participant completes a conversational exchange with a chatbot that presents moral situations. During this exchange, the bot may ask about the reasoning behind a choice, or for any other information it needs to deem the exchange complete.
  2. These conversations are stored such that the participant who completed the assessment has full control, or at least enough control to permit or restrict access and usage. A conversation may carry limited public metadata that can be used to classify the scenario, but that metadata should not include information derived from the participant's reasoning, as this would be akin to PII or other private data (a rough data-model sketch follows this list).
  3. These conversations should be considered work products that an entity (CORP) can subscribe to, earning the participant rewards. CORPs can use large sets of moral conversations in an a.i. training model for many purposes. Conversations could also be voluntarily provided to certain for-good community projects.
  4. It may be necessary for these transactions to occur differently than the traditional flow. For example, the participant holds the private evaluations as a sort of trained model; a new scenario is presented to it and evaluated offline, and only the result is transmitted.
  5. With blockchain contracts, these transactions can be clearly defined and publicly scrutinized before they are used in a production system. I think a layered contract system is needed to protect the goals of every party to the transaction: the human needs their private data protected, the incentivizer needs the processed data protected, and so on.
  6. At this point, the collected moral conversations can be used by web3 software. For example, a game could use these moral conversations to produce a back-story for a character. You could use your "human" story or one that you have created for a purpose-built, online character.
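
A rough illustration of items 2, 3, and 5: the sketch below separates a conversation's limited public metadata from the private content the participant controls, with access expressed as explicit, expiring grants. This is only a minimal sketch; every type and field name (MoralConversation, AccessGrant, and so on) is hypothetical and not part of any existing system.

```typescript
// Hypothetical data model: a stored moral conversation splits public,
// classification-only metadata from private content that the participant controls.

interface PublicMetadata {
  scenarioId: string;        // which moral situation was presented
  topicTags: string[];       // coarse classification only, no reasoning details
  completedAt: string;       // ISO timestamp
}

interface PrivateContent {
  transcript: string[];      // the full exchange with the chatbot
  reasoning: string;         // the participant's stated reasoning (treated like PII)
}

interface AccessGrant {
  grantedTo: string;         // CORP identifier
  purpose: "training" | "for-good-project";
  expires: string;           // ISO timestamp; access lapses automatically
}

interface MoralConversation {
  id: string;
  participant: string;       // pseudonymous participant identifier
  metadata: PublicMetadata;  // always visible
  content: PrivateContent;   // only released under a matching grant
  grants: AccessGrant[];
}

// A CORP only sees private content if the participant granted access and it has not expired.
function contentVisibleTo(conv: MoralConversation, corp: string, now: Date): PrivateContent | null {
  const ok = conv.grants.some((g) => g.grantedTo === corp && new Date(g.expires) > now);
  return ok ? conv.content : null;
}

// Example usage with placeholder values.
const example: MoralConversation = {
  id: "conv-001",
  participant: "participant-abc",
  metadata: { scenarioId: "trolley-variant-7", topicTags: ["harm", "duty"], completedAt: "2024-01-01T00:00:00Z" },
  content: { transcript: ["bot: ...", "participant: ..."], reasoning: "..." },
  grants: [{ grantedTo: "corp-x", purpose: "training", expires: "2025-01-01T00:00:00Z" }],
};

console.log(contentVisibleTo(example, "corp-x", new Date("2024-06-01")) !== null); // true
console.log(contentVisibleTo(example, "corp-y", new Date("2024-06-01")) !== null); // false
```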

Possible item/trait lifecycle -

  1. Create a named list of items.
  2. Send them to people as base digital objects so that each person can provide extensive information about the object, including any data they wish.
  3. Then that recipient/creator can create a new digital object using the base object and their descriptive effort. The new digital object itself contains only a detailed story written by a human. Only when the story is assessed by a game's trained a.i. model does it produce a summary that conforms to that game's rules and bounds, so the digital object can be reused for other purposes or imported into other games. In short, you have a base digital object with an item name and perhaps some properties; a human writes the backstory and creates a new digital object that consumes the base object (see the sketch after this list). [These digital objects will become innumerable, and when combined with an a.i. game they will generate a story that is unique. The story begins when the player logs in; the story is dynamic and continues recording information based on experiences or interactions during play.]
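
To make the base-object / derived-object split more concrete, here is a minimal sketch under assumed names (BaseDigitalObject, DerivedDigitalObject, and assessForGame are all hypothetical): the derived object carries only the human-written backstory, and a game-specific summary exists only after that game's model assesses it.

```typescript
// Hypothetical sketch of the base-object / derived-object lifecycle.

interface BaseDigitalObject {
  id: string;
  name: string;                       // the item name from the curated list
  properties: Record<string, string>; // optional starting properties
}

interface DerivedDigitalObject {
  id: string;
  baseId: string;    // consumes (references) the base object
  backstory: string; // detailed story written by a human; the only payload
}

// A game-specific summary is produced only when a game's model assesses the story,
// so the same derived object can be imported into many games.
interface GameSummary {
  game: string;
  summary: string;
}

function deriveObject(base: BaseDigitalObject, backstory: string): DerivedDigitalObject {
  return { id: `${base.id}-derived`, baseId: base.id, backstory };
}

// Placeholder for a game's trained model; a real one would interpret the story
// within that game's rules and bounds.
function assessForGame(obj: DerivedDigitalObject, game: string): GameSummary {
  return { game, summary: `(${game}) interpretation of: ${obj.backstory.slice(0, 40)}...` };
}

const base: BaseDigitalObject = { id: "item-17", name: "Iron Lantern", properties: { material: "iron" } };
const derived = deriveObject(base, "Forged by a lighthouse keeper who refused to let ships wreck on her watch...");
console.log(assessForGame(derived, "game-a"));
console.log(assessForGame(derived, "game-b")); // same object, a different game's reading
```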

Possible feature - The app/phone does background communication with each player/group. You link your PC, or other processing unit, to your account and use its power to generate, generate, and generate. The generative aspect uses the player's saved experiences to create an adventure that aims to expose the player to situations and scenarios that allow for personal growth.

Possible security - The digital object stores data. Private information is saved locally and cannot be transmitted, similar to the Locked Folder in Google Photos on Android. This may be over-cautious, but even so, the method could still be used as a secure local backup. Character data is saved to the digital object in anonymized form; a local a.i. can handle this filtering.
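
A minimal sketch of the local filtering idea, with hypothetical names (LocalCharacterRecord, anonymize): private fields stay on the device, and only the anonymized view is ever written to the digital object. A real implementation would use a local a.i. model rather than a hard-coded field list.

```typescript
// Hypothetical sketch of the local filtering step: private fields never leave the
// device, and only an anonymized view is written to the digital object.

interface LocalCharacterRecord {
  legalName: string;      // private: stays on-device only
  email: string;          // private: stays on-device only
  playStyle: string;      // shareable
  moralSummary: string;   // shareable, already reduced by the local a.i.
}

interface AnonymizedCharacterData {
  playStyle: string;
  moralSummary: string;
}

// Stand-in for the local a.i. filter; here it simply drops the private fields.
function anonymize(record: LocalCharacterRecord): AnonymizedCharacterData {
  const { playStyle, moralSummary } = record;
  return { playStyle, moralSummary };
}

const local: LocalCharacterRecord = {
  legalName: "(kept locally)",
  email: "(kept locally)",
  playStyle: "cautious diplomat",
  moralSummary: "tends to weigh harm to bystanders heavily",
};

// Only the anonymized view would be saved to the digital object / transmitted.
console.log(anonymize(local));
```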

I think this would be communication across chains. A player would have their own blockchain that saves their data; perhaps this would be secure enough to hold private data. There would also be a CORP chain that the player authenticates to, and each chain could secure its own data and allow conditional sharing. The conditional sharing would involve a contract that clearly describes the contract's properties and the conditions.
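
One way to picture the conditional-sharing contract is the minimal sketch below, with hypothetical names (SharingContract, mayShare) and no real blockchain SDK: the contract records which fields may be shared, for which purposes, and until when, and both chains can evaluate the same terms before anything moves.

```typescript
// Hypothetical sketch of a conditional-sharing agreement between a player chain
// and a CORP chain. It models only the terms; no real blockchain SDK is used.

interface SharingCondition {
  field: string;                 // which piece of data may be shared
  allowedPurposes: string[];     // e.g. ["training"]
  requiresSummaryOnly: boolean;  // share the offline-generated summary, not raw data
}

interface SharingContract {
  playerChain: string;  // identifier of the player's chain
  corpChain: string;    // identifier of the CORP chain
  conditions: SharingCondition[];
  validUntil: string;   // ISO timestamp
}

function mayShare(contract: SharingContract, field: string, purpose: string, now: Date): boolean {
  if (new Date(contract.validUntil) <= now) return false;
  return contract.conditions.some(
    (c) => c.field === field && c.allowedPurposes.includes(purpose)
  );
}

const contract: SharingContract = {
  playerChain: "player-chain-42",
  corpChain: "corp-chain-1",
  conditions: [{ field: "moralSummary", allowedPurposes: ["training"], requiresSummaryOnly: true }],
  validUntil: "2025-01-01T00:00:00Z",
};

console.log(mayShare(contract, "moralSummary", "training", new Date("2024-06-01")));  // true
console.log(mayShare(contract, "rawTranscript", "training", new Date("2024-06-01"))); // false
```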

The character - All of your responses shape your digital character. The items you collect have special properties that can be further enhanced by the user's blockchain, which contains all of your moral responses regardless of which CORP chain initiated the request to respond to a moral prompt. These responses would need a summary generated by the user's own offline a.i., so the "work" stays protected within the user's blockchain while the necessary result can be public.

Critical Mass - With enough truthful responses, a sort of moral a.i. is born. Particular attention must always be paid to the moral prompts/situations and how they are assessed. For example, responses would eventually be sourced from different countries, which have different physical environments and different cultures. Because of this, I don't think judgement should exist, as it would apply a static label to an ever-changing human. I also think there needs to be a kind of oath of honesty, since the system could be manipulated with enough leverage.

A possible use could be to track the morality of large sets of people and use that to inform the creation of a new Moral Code. It would be self-sustaining, updating itself to encourage people to improve compassion, understanding, respect, and other traits that help create a more civil, global society.

The Carrot - Currency for honesty, or a perpetual barter town. For example, a user could offer one year of access to certain data, i.e. responses to moral situations, in return for payment; the response is currency to an a.i. training model. The response/conversation would be thoughtful, descriptive, and as honest as you can be with the information you know at the time. This response is of considerable value, as it gets immutably added to your blockchain. Since these responses are the product of a human's work, they need to be protected against exfiltration. Perhaps they are encrypted in such a way that they can be shared and consumed by an a.i. without revealing the source information.
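
The stronger goal of letting an a.i. consume a response without ever seeing the source would need techniques beyond ordinary encryption (for example secure enclaves or homomorphic encryption). As a much simpler placeholder, here is a sketch of the time-limited paid access itself: the response is stored encrypted, and the decryption key is released only while the grant is active. All names here are hypothetical.

```typescript
// Hypothetical sketch of "1 year access in return for payment": the response is
// stored encrypted, and the decryption key is only released while the paid grant
// is active.

import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

interface PaidGrant {
  subscriber: string;
  expires: Date;
}

const key = randomBytes(32); // held by the participant / their agent
const iv = randomBytes(12);

function encryptResponse(plainText: string): { data: string; tag: string } {
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = cipher.update(plainText, "utf8", "hex") + cipher.final("hex");
  return { data, tag: cipher.getAuthTag().toString("hex") };
}

// The key is only handed out while the grant is still valid.
function releaseKey(grant: PaidGrant, now: Date): Buffer | null {
  return grant.expires > now ? key : null;
}

function decryptResponse(enc: { data: string; tag: string }, k: Buffer): string {
  const decipher = createDecipheriv("aes-256-gcm", k, iv);
  decipher.setAuthTag(Buffer.from(enc.tag, "hex"));
  return decipher.update(enc.data, "hex", "utf8") + decipher.final("utf8");
}

const stored = encryptResponse("My reasoning about the scenario was...");
const grant: PaidGrant = { subscriber: "corp-x", expires: new Date("2025-06-01") };

const k = releaseKey(grant, new Date("2024-06-01"));
console.log(k ? decryptResponse(stored, k) : "access expired");
```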

Law and Order for the new world - A deterrent to falsehoods. These additions carry metadata that, if found to be false, triggers an update to your public blockchain profile. The only way to redeem yourself is to complete an assigned moral quest. This is meant as penance, not punishment: volunteer at a shelter for a year, be a big brother, work for a nonprofit, or do other philanthropy. Redemption costs the human time, which is the only resource with any true value. It is also meant to inject goodness into the world, or to further the goal of a vetted for-good company.
