Just shutting it down seems like the worst way to keep the customers. They'd be better off rebranding it as Salesforce Clockwise and then slowly transitioning them.
The level of AI use you're describing isn't what people are complaining about (and they probably don't even notice it). Fixing grammar and the occasional word-order change isn't going to make your entire article sound like insincere bullshit.
I would genuinely rather read this than read an AI-generated piece. AI-generated articles read like they are trying to sell me on their scam crypto meme coin.
What's most impressive about it is that it uses a system, Megaladon, which is pretty obscure. There isn't much documentation and there aren't many examples, but it can still formalize an advanced undergraduate textbook.
From the article it's hard to tell if Cantor really did plagiarize (though it seems Dedekind thought he did).
According to the article, Cantor proved the theorem first and sent it to Dedekind. Dedekind suggested a simplification of the proof, which Cantor used when he wrote it up. The story doesn't make Cantor look good, but if the original proof by Cantor is correct, then the credit for the theorem still basically belongs to Cantor.
If I understand the article correctly, that second proof was published as a rider on a first proof that was entirely Dedekind's. So, there was definitely a credit owed at time of publishing.
I came away with the impression that the biggest villain in this story was Kronecker. Without the need to tiptoe around his ego and gatekeeping, these results may have been published as a paper with joint authorship.
I read it the other way. Here's the quote from the article:
On December 7, 1873, he wrote to Dedekind that he thought he’d finally succeeded: “But if I should be deceiving myself, I should certainly find no more indulgent judge than you.” He laid out his proof. But it was unwieldy, convoluted. Dedekind replied with a way to simplify Cantor’s proof, building a clearer argument without losing any rigor or accuracy. Meanwhile Cantor, before he’d received Dedekind’s letter, sent him a similar idea for how to streamline the proof, though he hadn’t worked out the details the way Dedekind had.
"Dedekind quickly replied that...he’d worked out a proof that the algebraic numbers (the numbers you get as solutions to algebra problems) could be counted.
[...]
Weierstrass had been most excited about the proof that algebraic numbers are countable. (He would later use that result to prove a theorem of his own.) So Cantor chose a misleading title [for his paper] that only mentioned algebraic numbers.
[...]
Writing his paper, Cantor put the proof about algebraic numbers first. Below it, he added his own proof that the real numbers cannot be counted — Dedekind’s simplified version of it, that is."
So the first proof -- the one the article was titled after -- was completely created by Dedekind.
> he’d worked out a proof that the algebraic numbers (the numbers you get as solutions to algebra problems) could be counted
I can't say I'm fully comfortable with that characterization of the algebraic numbers. The definition itself does suggest a proof that they are countable:
1. The number of symbols that can appear in a well-defined algebra problem is finite. (For example, if we define algebra problems as being posed in written English, we can use an inventory of no more than 50 symbols to define them all. If we define "algebra problems" in some other way, the definition will specify how many symbols are available.)
2. The number of possible strings describing algebra problems, created from this finite symbolic alphabet, is necessarily countable, because the strings have finite length.
3. Each algebraic number is the solution to one of those strings, and therefore the algebraic numbers are countable.
But I don't really feel like it's possible to learn anything about the numbers from that proof.
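The counting step in that sketch is mechanical enough to write down. Here's a minimal Python sketch (the toy alphabet and the notion of "problem description" are my placeholders, not anything from the article): enumerate all finite strings over a finite alphabet shortest-first, so every string, and hence every problem it describes, gets a definite index.

```python
from itertools import count, product

def enumerate_strings(alphabet):
    """Yield every finite string over a finite alphabet, shortest first.

    Each string appears at a definite position in the stream, which is
    exactly what it means for the set of all such strings to be countable."""
    for length in count(1):
        for chars in product(alphabet, repeat=length):
            yield "".join(chars)

# Index the first few "problem descriptions" over a toy alphabet.
gen = enumerate_strings("x+-*=01")
first = [next(gen) for _ in range(10)]
```

And, as the comment above notes, this enumeration tells you nothing about the numbers themselves, only that the descriptions can be listed.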
Yeah, that was a weird way to describe the algebraic numbers. The formal definition (those numbers x for which a nonzero polynomial p with integer coefficients exists with p(x) = 0) is not that complicated, is it?
You can also get to computable numbers through a similar argument, substituting something Turing-complete for algebra. You definitely do get to learn some interesting things about numbers from computable numbers. The differences between the computables and the full reals are much more subtle than the differences between the rationals and the reals.
> The differences between the computables and the full reals are much more subtle than the differences between the rationals and the reals.
How so?
Using the definition of computable numbers where you provide input of n and the output is every digit of the number up to n places past the decimal point, we can rephrase that definition like so:
A computable number c is one with the following property:
A Turing machine exists which, provided with a tolerance δ, will exhibit a rational number q < c such that c - q < δ
Clearly, a suitable rational will always exist, since rationals can be found within any distance of any real.
But for some particular real, we might not be able to find that rational through the use of a fixed Turing machine, in which case the real would be noncomputable. This suggests to me that there is a wider gap between the computables and the reals, where the approximation of a real number is limited by the need to describe it with a Turing machine, than there is between the rationals and the reals, where we can use the same approximation, but without that limitation.
(Obviously the rationals are a subset of the computables, but if we're considering a relationship to the real numbers, the rationals seem to have one that is closer and more direct...? The relationship of a computable number to a real number is defined through intermediary rational numbers.)
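To make that definition concrete, here's a minimal sketch of the "machine" for one particular computable number, √2. The choice of bisection is mine, not anything from the thread: given a tolerance δ it exhibits a rational q with q < √2 and √2 − q < δ, working entirely with intermediary rationals, just as described above.

```python
from fractions import Fraction

def sqrt2_approx(delta):
    """Stand-in for the Turing machine in the definition above: given a
    tolerance delta, return a rational q with q < sqrt(2) and
    sqrt(2) - q < delta. Bisection on q**2 < 2 keeps everything in
    exact rational arithmetic -- no floating point, no reals."""
    lo, hi = Fraction(1), Fraction(2)   # invariant: lo**2 < 2 <= hi**2
    while hi - lo >= delta:
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
    return lo  # lo < sqrt(2) <= hi and hi - lo < delta

delta = Fraction(1, 10**6)
q = sqrt2_approx(delta)
```

A noncomputable real is precisely one for which no such fixed program can exist, even though the approximating rationals themselves always do.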
"This suggests to me that there is a wider gap between the computables and the reals"
I didn't say "wider". I said more subtle. It doesn't take much mathematical intuition and training to understand the rationals versus the reals. Understanding the computables versus the reals is a lot more tricky and takes a lot more thought. The simple arguments that show the difference between the rationals and the reals require a lot of very careful adjustment if you want to translate them to the computables versus the reals.
That the credit for the theorem belongs to Cantor is not in question. This is acknowledged in the article:
>The revelation about Cantor’s result doesn’t undermine his legacy. He was still the first person to prove that there are more real numbers than whole ones, which is what ultimately opened up infinity to study.
What he is alleged to have plagiarised are the proofs, or at least one of them. The original article by Goos [0] contains a lot more detail about this, including a partial transcription of the letter by Dedekind that Cantor is accused of plagiarising. The story is complex.
1. Cantor's paper has two theorems: the countability of algebraic numbers and the uncountability of reals.
2. The proof of the former appears in Dedekind's letter, and Cantor acknowledges this in his response. Dedekind mentions in his letter that he only thought about proving this because of Cantor's prompt, and only wrote it up in the hope of helping Cantor. Dedekind felt that Cantor's proof was "word for word" his, although that's not quite the case. It is essentially the same proof, though.
Cantor also felt that Dedekind's proof that the set of algebraic numbers is countable was essentially the same as his own proof of the countability of tuples. The fact remains that he didn't think of adapting that proof himself, and that Dedekind was the first to prove the theorem is not in question.
3. Dedekind was not the first to prove the uncountability of the real numbers. However, he gave Cantor a number of ideas in that same letter. Namely, he suggested proving the uncountability of the interval (0,1), and it seems he gave a pointer towards how to build the diagonalisation argument, although how this statement was useful to Cantor (page 76 of Goos' paper) escapes me.
EDIT: it's not a pointer to the diagonalisation argument, it is an argument why proving the theorem on (0,1) is enough.
4. Cantor proved the uncountability of reals shortly afterwards, and shared his proof with Dedekind. Dedekind simplified the proof in his reply, and Cantor seems to have come up with a similar simplification on his own. None of these letters are analysed in Goos' article.
5. Cantor published the two theorems; the first proof is essentially the same as Dedekind's, and the second is possibly Dedekind's simplified version of Cantor's. Dedekind is not acknowledged at all in that paper, due to academic politics.
Goos' paper is very detailed and quite readable. I recommend it. The site is pretty annoying and you can't download the article without creating an account, but you can read the article online.
Even if the most important theorem of the two is unquestionably creditable to Cantor, the first one should likewise unquestionably be credited to Dedekind, at least partially. This is where the accusation of plagiarism stems from.
Beyond the question on plagiarism, there is no question that Cantor and Dedekind worked together on this. The lack of acknowledgement by Cantor is certainly quite unfortunate.
It's best practice to say something like "Noether's real story is recounted in my book [link]". This both establishes you as a subject matter expert, and stops your comments looking like disingenuous grift.
It's literally cited in his bio, and he's using his real name on HN. It's about as far from grift as it could be. If he's being curt, he's probably (rightfully) frustrated that "journalists" are getting such bottom-of-the-barrel facts wrong.
It is not about better or worse; it is about correcting myths created later on that were intended to paint the Church as the epitome of backwardness.
Galileo's affair wasn't about a noble scientist going up against the stupid masses and an oppressive institution designed to keep people in the dark, providing strong evidence for a revolutionary theory, and being punished for his great genius.
Agreed. I'd also say that I think our habit of canonizing whoever happens to be perceived as the 'good guy' in history, and demonizing the 'bad guy' tends to make history much more difficult to learn from, because the people involved go from being real humans to actors in a very artificial Hollywood style story of good vs evil.
The real story here is one that has played out endlessly in history in various contexts. And is a great example of why The Golden Rule is something valuable to abide, even if you're completely self centered. It also emphasizes that all people, even the Pope, are human - and subject to the same insecurities, pettiness, and other weaknesses as every other human. And more. It's a tale of humanity that has and will continue to repeat indefinitely.
But when you turn it into a story of good vs evil, you lose all of this and instead get a pointless attack on one institution, which is largely incidental to what happened. For instance you can see the Galileo story clearly in the tale of Billy Mitchell [1] who went from suggesting that air forces would dominate the future of warfare (back in 1919!) to getting court martialed and 'retired' for his way of trying to argue for such. His views would go on to be shown to be 100% correct in 1937, the first time a plane downed a capital naval ship. However, he died in 1936.
Galileo is a noble scientist going against a Pope who had his fee-fees hurt and then banned the truth. It doesn't make the Church any less backward.
Because the Church didn't even have a good theological reason for siding against Galileo. It was a fit of pique.
But people have so completely internalized the idea that truth must bow to power that they think the fact that the Church condemned Galileo's ideas because he was rude somehow exonerates it as an institution.
The patron and professor funds a paper, and it contains claims of proofs that don't exist and ad hominems against the patron. The patron then sabotages the author. Sure, not very professional by the patron, but still understandable.
Cutting it off at "the LangChain" is like if I took the first sentence of your edit and said "look, I think the general" ?? You think the general?