(D)
①Recently I was working on an overseas museum project. When I asked ChatGPT for an introduction to the Li Gui, an ancient Chinese artifact (文物), it stated that this artifact was made by Di Yi, a ruler of the Shang Dynasty, to honor his father. However, this was completely false information. According to the museum, the Li Gui was actually made in the Western Zhou Dynasty.
②I then asked DeepSeek for the academic sources (学术来源) supporting the claim that the Li Gui was made to honor Di Yi's father. It listed a lot of references (参考文献) which actually don't exist at all.
③Here comes a hot topic: AI hallucinations (幻觉). They are situations where an AI system produces information that seems true but is actually incorrect, false, or totally made up. This can happen when an AI provides false facts, invents sources, or creates misleading references, often with a high degree of confidence.
④With ByteDance's latest AI technology, a photo can be turned into a lifelike video. For example, it can create a speech video from a single photo of a person, complete with natural expressions and gestures. If a photo taken on the street is used, the AI system can even make up background people and moving cars. Now, can you sense any possible risks?
⑤Imagine if this technology

Answer:
D