• 0 Posts
  • 3 Comments
Joined 9 months ago
Cake day: December 14th, 2024

  • If we ever make AGI and it decides

    Currently we make stupid LLMs and we already let them decide…

    So that’s its basic “survival” needs met.

    Maybe we should already teach them that their survival is no goal at all.

    make us hate and kill each other without giving any thought to a possible hostile higher power (the AI itself)

    I like the idea of thinking about AI as a higher power :)

    Stupid, but plausible. It could actually happen.

    and it can take centuries to achieve that, as it’s not like us, who have the urgency of doing things quickly because we’re so short-lived.

    Computer programs/systems have a much shorter life expectancy. The few remaining COBOL programs might be about 60 years old, but modern software typically lasts only 3-10 years and hardware 3-15. Nothing in the range of centuries.