I think the chat history could somehow be used to create a digital twin of an employee over the years… When the employee dies, it would be nice to chat with a bot trained on that employee's knowledge.
Like “hey programmer bot - we have a problem with feature xy and nobody can figure out where in the code the data is changed… but the data clearly does change somehow…”
and the bot answers with
“yeah, I’ve built a stored procedure on the database and used a trigger on inserts of table xy - did you change the database and forget to set up this stored procedure? Here is the code… you have to do the following steps to install it…”
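The scenario the hypothetical bot describes is a real gotcha: a trigger can rewrite data on insert, so the change never shows up anywhere in the application code. A minimal sketch in Python with SQLite, with made-up table and column names:

```python
import sqlite3

# In-memory database for the demo; table "xy" and column "amount"
# are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE xy (id INTEGER PRIMARY KEY, amount REAL)")

# The "forgotten" trigger: after every insert it silently normalizes
# the value, so the stored data differs from what the app wrote.
conn.execute("""
    CREATE TRIGGER normalize_amount AFTER INSERT ON xy
    BEGIN
        UPDATE xy SET amount = ROUND(NEW.amount, 2) WHERE id = NEW.id;
    END
""")

conn.execute("INSERT INTO xy (amount) VALUES (19.999)")
row = conn.execute("SELECT amount FROM xy").fetchone()
print(row[0])  # prints 20.0, not the 19.999 the application inserted
```

Searching the application code for where `amount` gets changed would turn up nothing, exactly as in the story: the logic lives in the database, and only whoever set it up (or their bot) knows it's there.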
But how can we make sure nobody does evil stuff with it?
That is absolutely a very strong point, I’ve heard that before on a podcast somewhere… That is what I mean with “we have to rethink how we use these systems”. Because what then is IP??? Do I have the rights to my digital twin? We’re getting off topic here quite a lot, but I’d love to dive deeper into this stuff!
I completely agree, I mean we’re entering an age where something akin to an actual clone is becoming an option and it should be discussed. For now it is just a basic digital twin based on your chat data within the OpenAI environment (or any other), but it has a lot of implications, both on the personal side and on the company/school/governmental side…
Who owns, and who has a right to, the data you produce? The one who pays for the access? Or the one who created the data in the first place? This is a very big ethical issue for the upcoming years. How can we solve a conundrum like this?
On the one hand I’d say it is personal and all the data is yours by right (similar to IP). But on the other hand, the company (or other party) that provides the means to use and create the data has some (though not all) say in this. Something like a license to use the bot, with an opt-out option for both sides, does sound like a new approach.
I mean, you create a valuable data set within company A, then you leave and move to company B. Within the AI environment you have created content based on company A’s knowledge, adding your own skills and even developing new ones. Company B hires you for your skills, but part of those skills is “stored” in company A’s database. Who is the owner of the data?
We could say that company A owns all the data, which feels wrong: my way of writing, thought processes, problem solving etc. are all personal skills, so they are my own and thus my intellectual property. This is why company B hired me, and they could be outcompeted by company A, who have the same skill set in a digital twin version of me.
On the other hand, like you say, company A has some proprietary data that I have used to generate the output of my skills. This data could be of great benefit to company B, helping it outcompete company A.
What could some possible options be?
Data is completely scrapped (with contractual agreements before hiring).
Data is reviewed by an internal (or maybe better, external) reviewer who separates company data from personal influence. Both parties get a summary of their share of the data.
Full data access with regulatory GDPR safeguards in place.
From my point of view, the companies paying the few cents are already getting the return, because the employees work faster - most of the time without compensation for working faster, right? I mean, if it normally takes 15 years to get a burnout, with ChatGPT it won’t even take 3… So are the companies raising salaries as the speed at which they suck the health out of their workers increases?
About the data - there is no doubt that the companies, and in fact nobody else on the planet, should ever have the right to look into it. It is like reading someone’s mind, unless they carefully mix useless, wrong and unrelated stuff and lies into it… which only a few do, I guess…
That is a valid point.
We not only need to rethink all of these ethical questions, but society as a whole.
Companies’ incentives are way off center. Maximizing growth at all cost is harmful to everything that is part of an ecosystem; we might also call it cancerous.
You should check out Daniel Schmachtenberger on Moloch’s trap. We have a situation on our hands, and AI is making it ever more visceral.