A colleague of mine recently told me that ChatGPT has advanced to a state where it can handle homework assignments, at least in the first courses in Analysis and Linear Algebra. This was to be expected. When I last tried the bot a while ago, the results ranged from unusable to ridiculous. I have now retried it, and it seems to provide usable answers, and even complete answers for simple problems.
Grading written homework has always been legally difficult, because students can argue that they are treated unfairly compared to those students who got help with their homework. So I never did it. The only solution is to grade the presentation of the solution in an oral exam. But our university dislikes multiple examinations in one semester, to relieve the stress on the students. I can understand this in view of the workload that is expected today. Keeping students busy all week may be good for teaching short-term learning and test writing, but not a deeper understanding.
But today’s topic is different. Can ChatGPT really be used for the homework? Could it be used to learn from? I still have my doubts. To check, I ran three tests.
(1) Prove
\exp(x) > x \qquad\text{for all $x \in \R$}
(2) Find all functions f(x), defined for all x>0, such that
f'(x) = f(x)^2.
(3) Prove or disprove:
\sum_{n=1}^\infty a_n \quad \text{converges} \implies \sum_{n=1}^\infty \frac{a_n}{n} \quad\text{converges}
The first two are simple exercises. The last one is too difficult without a further hint, but I wanted to see what ChatGPT makes of it. In the end, none of the three answers was really satisfying. I did not use LaTeX notation, but all problems were correctly repeated.
(1) The bot correctly defined a function f(x) = exp(x) - x and stated that it now needs to prove f > 0. That is good; many students don’t write this down. Then it correctly proved that f'(x) < 0 for x < 0 and f'(x) > 0 for x > 0, using properties of the exponential function which I would accept as known. With f(0) = 1 it would now be finished, simply by citing facts about monotonicity.
But at that point, it went on to compute f'' and the limits as x approaches infinity. That is completely unnecessary, and it is not clear why it should help to establish monotonicity. I have read that kind of chatter from students. It seems to be a high-school habit to always use f'' for a complete analysis; in most cases, this is not needed. Anyway, I would have accepted the proof.
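Written out the way I would expect it, the whole argument fits in a few lines. With f(x) = \exp(x) - x we have
f'(x) = \exp(x) - 1 < 0 \quad\text{for } x < 0, \qquad f'(x) > 0 \quad\text{for } x > 0,
so f is strictly decreasing on (-\infty, 0] and strictly increasing on [0, \infty). Hence f(x) \ge f(0) = 1 > 0 for all x, which is the claim.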
(2) As expected, the bot knows how to solve differential equations of that type. But it completely missed that I wanted the functions to be defined for all x>0. Okay, a minor oversight.
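For the record, the standard computation by separation of variables: by the usual uniqueness argument, a solution is either identically zero or never zero, and in the latter case
\frac{f'(x)}{f(x)^2} = 1 \implies -\frac{1}{f(x)} = x + c \implies f(x) = -\frac{1}{x+c}.
Such a solution is defined for all x > 0 exactly when the pole at x = -c does not lie in that interval, i.e. for c \ge 0. Together with f \equiv 0, these are all the solutions.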
(3) This is not solvable by students, unless they know the trick by Abel. I just wanted to see what the bot makes of it. ChatGPT "found" a counterexample. In the German version, it explicitly "proved" that
\sum_{n=1}^\infty \frac{1}{n \log(n)} \quad\text{converges,}
and
\sum_{n=1}^\infty \frac{1}{n^2 \log(n)} \quad\text{diverges.}
To get to this, the bot used the integral test the wrong way around. How that happens, I don’t know. It also did not state the necessary conditions for this criterion. The English version was even worse: the bot gave up in the middle of a sentence.
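In fact the bot had it exactly backwards. This was no part of my exchange with the bot, but a quick numerical sketch in Python illustrates it (the sums start at n = 2 to avoid log(1) = 0):

```python
import math

def partial_sum(term, N):
    """Partial sum of term(n) for n = 2, ..., N."""
    return sum(term(n) for n in range(2, N + 1))

# sum 1/(n log n): partial sums keep growing (like log log N), the series diverges
for N in (10**3, 10**4, 10**5):
    print(N, partial_sum(lambda n: 1 / (n * math.log(n)), N))

# sum 1/(n^2 log n): partial sums stabilize quickly, the series converges
for N in (10**3, 10**4, 10**5):
    print(N, partial_sum(lambda n: 1 / (n**2 * math.log(n)), N))
```

The first column of numbers grows without bound, while the second settles near a limit below 1. Of course, the actual proofs use the integral test, which the bot garbled.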
The trick for this can easily be found on the net, so I am a bit surprised by the results. I suspect that the phrase "prove or disprove" is the problem. It is not something ChatGPT can keep talking about; it must instead make a decision. Another confusing type of problem might be multiple-choice questions.
In case you are wondering about (3): you need to use a trick by Abel (I believe), called summation by parts. It’s like integration by parts and goes like this.
\sum_{n=1}^N x_n (y_{n+1}-y_n) = x_{N+1}y_{N+1} - x_1 y_1 - \sum_{n=1}^N (x_{n+1}-x_n) y_{n+1}
Abel applied this to much trickier questions than simple summation. The proof of this is a nice application of induction. Or you can check that the same products appear on both sides.
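Or, if you distrust the index gymnastics, let a computer check it on random data. A small Python sketch (the 0-based lists store x_1, ..., x_{N+1} and y_1, ..., y_{N+1}):

```python
import random

def lhs(x, y, N):
    # sum_{n=1}^N x_n (y_{n+1} - y_n); list index i holds the (i+1)-th term
    return sum(x[n] * (y[n + 1] - y[n]) for n in range(N))

def rhs(x, y, N):
    # x_{N+1} y_{N+1} - x_1 y_1 - sum_{n=1}^N (x_{n+1} - x_n) y_{n+1}
    return x[N] * y[N] - x[0] * y[0] - sum((x[n + 1] - x[n]) * y[n + 1] for n in range(N))

random.seed(1)
N = 10
x = [random.uniform(-1, 1) for _ in range(N + 1)]
y = [random.uniform(-1, 1) for _ in range(N + 1)]
print(abs(lhs(x, y, N) - rhs(x, y, N)) < 1e-12)  # prints True
```

This proves nothing, of course, but it catches index errors faster than staring at the formula.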
To make this work for the problem (3), set
x_n = \frac{1}{n}, \quad y_n = a_1+\ldots+a_{n-1}, \quad y_1 = 0.
Then it becomes
\sum_{n=1}^N \frac{a_n}{n} = \frac{y_{N+1}}{N+1} + \sum_{n=1}^N \frac{y_{n+1}}{n(n+1)}
Now the right-hand side can be seen to converge.
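For completeness: since \sum a_n converges, the partial sums y_n are bounded, say |y_n| \le M, so the first term on the right tends to zero, and the series converges absolutely by comparison with a telescoping sum:
\sum_{n=1}^\infty \left| \frac{y_{n+1}}{n(n+1)} \right| \le M \sum_{n=1}^\infty \frac{1}{n(n+1)} = M.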
Note that this still does not mean that the left-hand side converges absolutely. In fact, it is possible to give a counterexample. I leave that to the reader.