Homework Ex: What does ChatGPT know about Knowledge Graphs?

Out 2022-12-13, due 2022-12-23. Get your repo link.

This extra-credit homework will let you evaluate ChatGPT's knowledge of the technologies we explored in our course.

ChatGPT

ChatGPT is a new web-based chatbot released earlier this month by OpenAI. It interacts with people conversationally and has received attention for its detailed responses and historical knowledge, although its responses are not always completely accurate. Your goal is to test its knowledge of the kind of semantic knowledge graphs, and their underlying technologies, that we've covered in this class. Your assignment is to register for a free OpenAI account, use it to interact with ChatGPT, ask it to answer questions and solve six problems, and comment on how well it performed. Clone your HWex repository, which will have just one file: README.md. You will use this file to record your experiments and comments.

Five true/false questions

Read OpenAI's page on ChatGPT and take a look at this article on how to use it. Choose five random true/false questions from our old midterm and final exams and ask ChatGPT to answer each one. You may want to rephrase an exam question to make it more natural. For example, the T/F question from an old exam, "An XML element cannot have two attributes with the same name", might be rewritten as a "prompt" question like the following:
ChatGPT does a good job answering this question, producing the following response.
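For reference, the rule this sample exam question tests can be shown in a small fragment (our own illustration, not ChatGPT's output; the element and attribute names are made up):

```xml
<!-- Not well-formed: two attributes with the same name on one element -->
<book id="1" id="2"/>

<!-- Well-formed: attribute names within an element are unique -->
<book id="1" edition="2"/>
```

Any conformant XML parser will reject the first element with a well-formedness error, which is why the statement in the exam question is true.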
When asking it a question, you may need to give ChatGPT several tries and/or rephrase the question slightly to make it clearer. Like many transformer-based conversational systems, it involves a degree of randomness and will sometimes give a very strange response. You might try having it answer each question three times and use its best response. Add the best response you got, along with the prompt, to the README.md file, together with your assessment. Your evaluation should judge whether the answer was correct, complete, easy to understand, and well written. We added our example question as an example in the README.md file.

SPARQL questions

You should also ask ChatGPT one question of your own that asks it to write a SPARQL query for both DBpedia and Wikidata. For example
which returns the following answer. Ask the same question for both DBpedia and Wikidata. Your evaluation should judge whether the answer was correct, complete, easy to understand, and well written. You should also try to run it; if it fails to run for a simple reason, you can correct it and note this in your evaluation comments.

What and how to submit

Your HWex repository has just the README.md file. Complete it (including the final question), commit it, and push it back to GitHub.
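As an illustration of the kind of output you might evaluate, here is a sketch of queries for a hypothetical question, "List five people born in Baltimore", written for each endpoint. The choice of question, the property dbo:birthPlace, and the Wikidata identifiers (P19 for place of birth, Q5092 for Baltimore) are our assumptions for this example, not part of the assignment:

```sparql
# DBpedia version (run at https://dbpedia.org/sparql)
PREFIX dbo:  <http://dbpedia.org/ontology/>
PREFIX dbr:  <http://dbpedia.org/resource/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

SELECT ?person ?name WHERE {
  ?person dbo:birthPlace dbr:Baltimore ;
          rdfs:label ?name .
  FILTER (lang(?name) = "en")
}
LIMIT 5

# Wikidata version (run at https://query.wikidata.org/)
SELECT ?person ?personLabel WHERE {
  ?person wdt:P19 wd:Q5092 .   # place of birth: Baltimore
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 5
```

Note how the same question maps onto different vocabularies: DBpedia uses human-readable ontology terms, while Wikidata uses opaque P- and Q-identifiers plus a label service. Checking whether ChatGPT gets these endpoint-specific details right is a good part of your evaluation.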