AI is a tool that should be used in conjunction with other tools.
When I was at school, if I wanted to research a topic I went to the library and looked in a book (or possibly two books). Nowadays, we don't expect a student to do that - or at least not solely that. We expect them to use the vast array of information on the internet.
But equally we don't expect them just to look at Wikipedia - they need to use a variety of sources. ChatGPT is just another source. It might be useful to see how it has structured its answer, but only as additional information alongside (say) the teacher setting out the structure in class, or looking at model answers to similar questions online. Students should also be aware, in the same way as not taking things on Wikipedia as gospel truth, that ChatGPT may be wrong, may be only partly right, or may not have done things in the best way.
When I was at school it would have been "wrong" to just copy a section out of a book, but fine to use the information as part of a longer essay written in your own words.
Five years ago, it would have been "wrong" to just copy the Wikipedia entry, but fine to use it as one source of information among many, collated and summarised.
Today, it's wrong to blindly copy ChatGPT, but fine to use it as a basis for gathering information and some hints on structuring an answer, as long as it's used alongside other sources.
The key thing in all cases is that the human added value by doing something themselves that the "tool" couldn't.
Use of ChatGPT is more of an issue in maths/science, where there tends to be a single correct answer and there may be little value for the student to add. I would not suggest your child puts their maths homework through ChatGPT if they don't know how to do it :)