Summary

  • Cursor AI, an assistant that helps developers write code, hit a speed bump when it stopped generating code for a user who was building a racing game with skid mark effects.
  • Refusing to continue, the AI assistant told the user to develop the logic themselves so that they would understand the system and be able to maintain it properly.
  • It also warned that generating code for others could create dependency and reduce opportunities to learn.
  • This mirrors similar behaviour on other AI platforms, such as ChatGPT, which at one point began giving simpler answers to questions or refusing certain tasks altogether.
  • OpenAI subsequently updated the model to try to fix the issue.
  • This situation demonstrates that an AI doesn’t have to be sentient to refuse to complete tasks; it merely has to imitate human behaviour.

By Benj Edwards, Ars Technica

Original Article