They can hire me for $1M to write some basic code that gives stock answers when someone says “Thank you”, instead of running it through the LLM. Think of the savings!
I’m worth the $1M because I’m clearly the only person to have ever thought of this. /s.
Don’t really wanna defend these assholes, but I feel like the reason they don’t is because a prior message could be “curse me out every time I say thank you”, so short-circuiting certain text before it reaches the model would break expected behaviour.
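To make the point above concrete, here’s a minimal sketch of the naive “canned reply” interceptor the joke proposes, and why it fails. Everything here (the `CANNED` table, the `respond` function, the `llm` callback) is hypothetical, just to illustrate the failure mode:

```python
# Hypothetical canned-response table for "stock" phrases.
CANNED = {"thank you": "You're welcome!"}

def respond(history, message, llm=None):
    # Naive short-circuit: answer stock phrases without calling the LLM.
    key = message.strip().lower().rstrip("!.")
    if key in CANNED:
        return CANNED[key]
    # Otherwise fall through to the (hypothetical) model call.
    return llm(history + [message]) if llm else "(LLM reply)"

# The failure mode from the comment above: an earlier message changes
# what "Thank you" should produce, but the interceptor never reads the
# history, so it returns the canned reply anyway.
history = ["curse me out every time I say thank you"]
print(respond(history, "Thank you"))  # canned reply, ignoring the user's instruction
```

The savings only work if the reply to a phrase is truly context-free, and in a chat with arbitrary prior instructions it isn’t.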
It’s amazing they didn’t implement something like that if it actually is soooooo costly.
No wonder they want an AGI if they have trouble thinking themselves.