I think there's a level above that where the words to describe such structure are familiar and readily available and hey guess what? The model understands those too. Just about every pattern has a name. Or a shape. Or an analog or metaphor in other languages or codebases. All work as descriptors.
This presumes that most of this stays encoded as words in our brains: the effort to translate some of these into words might be similar to translating it into code (still words, just very precise).
It's like talking legalese vs plain English; or formal logic vs English. Some people have the formal stuff come more naturally, and then spitting code out is not a burden.
No, it really doesn't presume anything about brains or information encoding. Just points out that there is a level of mastery in which all the techniques and all the forms have names or adequate descriptions. Teachers often attempt to achieve this, to facilitate education.
It's no accident that an adage attributed to Aristotle runs: "Those who can, do. Those who understand, teach."
So yes, there is a level of mastery that is beyond being able to do a good job of designing and evolving complex systems which enables people to teach others the same skill set.
However, this is a smaller number of practitioners, and most have learned through practice and looking over how more experienced engineers apply their knowledge.
Where I disagree is with the implication that everybody is equally capable of teaching with words, or that there are no experts who are bad at teaching (humans or directing AI). Such experts clearly exist, which indicates the knowledge is not encoded as words for them.
It's been pretty clear in my experience that experts tend to be capable of working with the same ideas in many different forms. That's what I would call mastery. It implies "complete" knowledge, which probably means several interrelated encodings with loci in different parts of the brain. Those interrelated encodings will be highly associated, and discerning in an expert. Which implies a high degree of usefulness and specificity in communication. This matches my experience.
It turns out an awful lot of precision (plenty for many things) lives in library and web APIs, documentation, header files and dependency manifests. Language can literally just point at it without repeating it all. Eliminating the mistakes of manual copying in things like actuarial and ballistics tables is what the original computers were built for.
API Glue is the easy and boring part in programming. Nobody really enjoys wiring API A to API B, combining the results and using API C to push it forwards.
Any semi-competent AI Agent can do that with a plan you've written in 5 minutes.
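To make that concrete, here's roughly what such glue looks like. Everything here is made up for illustration (the record shapes, the join key); the fetch/push calls are injected as plain callables so the wiring itself is visible without any real network:

```python
# A minimal sketch of typical API glue: fetch from two sources,
# join the results, and push them onward. Field names and record
# shapes are hypothetical; fetch_a/fetch_b/push_c stand in for
# real HTTP calls so the combining logic can be tested offline.
from typing import Callable

def glue(fetch_a: Callable[[], list[dict]],
         fetch_b: Callable[[], list[dict]],
         push_c: Callable[[list[dict]], None]) -> int:
    """Join records from API A and API B on 'id', then send them to API C."""
    a_records = {r["id"]: r for r in fetch_a()}
    merged = [
        {**a_records[r["id"]], **r}   # B's fields win on conflict
        for r in fetch_b()
        if r["id"] in a_records       # inner join on shared ids
    ]
    push_c(merged)
    return len(merged)
```

The point being: there's nothing here an agent can't write from a five-minute plan; the only decisions are the join key and the conflict rule.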
I would love to see an AI try to make sense of the GTK API.
I may be wrong, but it seems that when people talk about easy glue code, they're talking about web service APIs, not OS APIs, not graphics or sound APIs, not file-format libraries,…
I used Sonnet 3.5 over a year ago to decrypt a notoriously shitty local government API to get data out of meetings, votes and discussions.
I know it's a piece of shit API done in the worst possible way on purpose (they don't want openness, but had to fulfill a law that mandates "openness") because I had previously tried to do it manually - twice. I ran out of whisky before I got anything done.
Sonnet _3.5_ almost one-shotted it with just the API "documentation" they had and access to Python and curl.
People have also hooked stuff into proprietary APIs on "smart" devices with zero documentation, just by having an Agent tirelessly run through thousands of permutations to figure it out.
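That tireless-permutation approach is simple enough to sketch. This is a toy version (the candidate field names and values are invented, and `call` stands in for a real HTTP request), but it shows the loop an agent runs thousands of times:

```python
# A toy sketch of brute-forcing an undocumented API: try every
# combination of candidate field names and values until a call
# is accepted. `call` stands in for a real HTTP request; all the
# candidate names and values here are hypothetical.
from itertools import product
from typing import Callable, Optional

def probe(call: Callable[[dict], bool],
          keys: list[str], values: list[str]) -> Optional[dict]:
    """Return the first payload the endpoint accepts, or None if all fail."""
    for key, value in product(keys, values):
        payload = {key: value}
        if call(payload):   # a real agent would also inspect the response body
            return payload
    return None
```

A human gives up after a few dozen guesses; an agent happily grinds through the whole product space.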
The Cambrian and Eocene reached around +14C compared to today[1]. Two of the warmest periods in Earth's history, granted. But life thrived. Governments, private property ownership, civilization, not as battle tested.
Our bodies won't be able to handle a temperature regime that hot overall. The factor to research is the wet-bulb temperature effect. Basically, our bodies are like sports cars: keeping them cool is a challenge under high humidity with temperatures near body temperature.
UNIVERSITY PARK, Pa. — As climate change nudges the global temperature higher, there is rising interest in the maximum environmental conditions like heat and humidity to which humans can adapt. New Penn State research found that in humid climates, that temperature may be lower than previously thought.
It has been widely believed that a 35°C wet-bulb temperature (equal to 95°F at 100% humidity or 115°F at 50% humidity) was the maximum a human could endure before they could no longer adequately regulate their body temperature, which would potentially cause heat stroke or death over a prolonged exposure.
Wet-bulb temperature is read by a thermometer with a wet wick over its bulb and is affected by humidity and air movement. It represents a humid temperature at which the air is saturated and holds as much moisture as it can in the form of water vapor; a person’s sweat will not evaporate at that skin temperature.
But in their new study, the researchers found that the actual maximum wet-bulb temperature is lower — about 31°C wet-bulb or 87°F at 100% humidity — even for young, healthy subjects. The temperature for older populations, who are more vulnerable to heat, is likely even lower.
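For anyone who wants to play with the numbers in that excerpt: wet-bulb temperature can be estimated from dry-bulb temperature and relative humidity with the Stull (2011) empirical fit. This is an approximation (valid roughly for 5–99% RH and −20 to 50 °C at sea-level pressure), not the psychrometric calculation a lab would use:

```python
# Approximate wet-bulb temperature from dry-bulb temperature (C)
# and relative humidity (%), using the Stull (2011) empirical fit.
# Valid roughly for 5-99% RH and -20 to 50 C at sea-level pressure.
import math

def wet_bulb_c(temp_c: float, rh_pct: float) -> float:
    """Approximate wet-bulb temperature in Celsius."""
    return (temp_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(temp_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)
```

At saturation the wet bulb approaches the air temperature, which is why the article can equate a 35°C wet bulb with "95°F at 100% humidity".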
It's a problem anywhere that temperatures reach that high. Higher latitudes have colder climates. Hence, not a problem. If it becomes a problem, people move toward the poles. No longer a problem.
Earth would have to warm by more than +35 to +50°C for the poles to be uninhabitable due to heat.
Yes, polar regions are reliably colder than equatorial regions. Lytton, BC hit the temperature you cite for one day on Tuesday, June 29, 2021. That's a sign of warming, and we should expect more warm days than in the past at any given latitude. But it is not evidence against the general case that polar regions have colder climates than equatorial regions.
This explains why I've never understood people casually mentioning 40°C+ temps; 34°C in Hong Kong with no breeze is about as much as I can handle.
No reason not. It would push human habitable zones into the high mid-latitudes and subpolar regions though. 55–65° N/S would be closest to comfortable temperatures. So, northern Canada and Russia, Greenland, Antarctica.
The mad rush to get there would likely extract a heavy toll.
The main problem is agriculture. If rain patterns get severely disrupted in most of world's current breadbaskets, it takes time to increase production in areas that may now have more favourable climate. During that time lots of people would starve.
Rain patterns and extreme weather events are the things to really worry about. Temperature changes alone can be mostly dealt with by planting different crops.
It’s not just about time to increase production. A lot of crops that grow at lower latitudes won’t have enough time to mature in the short summers of higher latitudes and may suffer from weaker sunlight due to the lower angle resulting in more intervening atmosphere. We might eventually be able to breed or genetically modify crops to be hardier — but the former takes time and the latter requires sufficient remaining civilization and security to support the labs.
No doubt the transition period would likely involve more death than most catastrophes in history. In part because there are simply more people. Available sunlight is also less nearer the poles, which already affects agriculture in places like Greenland. Crops would shift. We'd be more dependent on energy and supplemental light for certain crops. Adjustment would be difficult. But quite a bit of land would still be habitable.
Interesting. Paying close attention to geopolitics lately, it kind of seems like we're already in a slow-motion mad rush to own these places. Remember when Trump almost invaded Greenland?
From what I read recently (and I don't remember where it was), the current thinking is that it wasn't oxygen levels or temperatures, but the lack of predators that let dragonflies grow that big. A big dragonfly is much slower and an easier target. So unless you get rid of birds, you won't have giant dragonflies.
You need high oxygen content in the air though. Insect style circulatory systems aren't efficient enough to get oxygen to the cells without the air having a super high concentration of oxygen to begin with.
Basically like how, when people can't breathe well, you put them on oxygen to keep them alive, only getting oxygen into the blood is the bottleneck rather than getting it into the body.
Global cooling could be worse. But the danger from either comes from the speed with which it happens, and inflexible sociopolitical structures, more than from the absolute difference in temperature. Rapid change doesn't permit gradual adaptation like relocation to more habitable areas. The danger from the current global warming trend comes from its incredible rapidity compared to historical trends.
Given time, humans and other animals will move toward the poles or toward the equator to find habitable zones. Put that on a rush schedule and everyone suffers.
> How does growing crops work when it's dark 6 months a year?
Just fine. If the temperature would cooperate.
The land of the midnight sun actually has great yields for the few crops that tolerate the cool temperatures (basically low-growing greens and vegetables, not staple grains or fruit). But because the season is so short, temperature-wise, nobody really farms that stuff commercially up there.
> How does growing crops work when it's dark 6 months a year?
Have you noticed that all broadleaved trees and shrubs lose their leaves for half the year in temperate zones already?
Did you not wonder why that is?
They'll be fine. Annual crops are fine. Wildlife is fine if it's got somewhere to migrate to.
Tough for wildlife when there's nowhere to migrate to, though. But what's burning desert in summer might be just about tolerable hot tropics in winter.
The problem is that current tropical species can't handle the alternation of the seasons. You don't get seasons at the equator. Spring/summer/autumn/winter is a temperate-zones thing. Near the equator the sun rises and sets at the same time every day, and there are at best 2 seasons: the dry season, when it never rains, and the wet season, when it rains a lot all the time.
There are no guarantees in life; you can look up any random day, see a meteor streaking across the sky, and realize that this is the end regardless of "sociopolitical structures".
All that matters is enough sociotechnological progress to overcome these tests of existence.
> look up any random day and see a meteor streaking across the sky
That's happened rather more times in Earth's history than most folks are comfortable admitting. Tunguska would have leveled any major metropolitan city on the planet. I still think an impact is one of the more likely initiators of the Younger Dryas abrupt cooling and the worldwide ~100 m sea-level rise ~12,000 years ago, conspicuously aligned with the oldest surviving traces of city living, agriculture, etc. It's increasingly accepted that a large portion of human history is 100 m underwater on the continental shelves, estuaries, and other coastal areas where humans would have liked to live.
The impact hypothesis for Younger Dryas isn’t really tenable. Among other things, the climate effects of a large bolide impact would be global, whereas Antarctica actually warmed during YD. This “Polar See-saw” pattern is easily explained by a northerly meltwater pulse hypothesis, but not a bolide.
It's possible the sea-level rise could have initiated the cooling. But there is much disagreement as to what exactly initiated the deglaciation that caused the sea-level rise.
I would push back against the idea that intelligence agency behavior changes administration to administration. Looking through history, it's the intelligence agencies which have superior continuity of leadership. Which suggests things about who's directing who.
Bureaucracy in general exhibits that kind of hysteresis. It is like a running average of who has been in charge, mixed with a big dose of the culture created by the people who choose that sort of career. Ironically, that inertia is considered by political scientists to be a safeguard for democracy.
People need to know about https://en.wikipedia.org/wiki/Parallel_construction as well. The technique is used to shield these secret programs by laundering the information they collect through plausible evidentiary chains.
The number of terrorists who have been caught because they were controlled by a police officer "because they ran a traffic light" (yeah, sure) is wild.
In the EU, at some point, after every single terrorist attack the terrorists' names were known because they had left their passports in a car abandoned at the scene (yeah, sure again).
The really amazing thing is that they don't know the names of the terrorists right away, because apparently the terrorists don't carry their passports on them. No: they all leave them in the last car they used.
Probably because, by now, terrorists have seen past terror attacks and think: "Oh, I'm supposed to have my passport with me, but then leave it in the last vehicle I use before killing people."
In France we had a case where the government tried to bring terrorist charges against someone, but the police couldn't materially have seen what they wrote in their report: their car was too far away and the timing didn't line up. Eventually the policemen invoked confidentiality rules to keep their mobile phones from being examined, so no accurate probing could be done. The judge threw out the terrorist charges anyway, because the facts didn't warrant them. Since the police knew exact facts without being there, there is a strong suspicion that they spied on the suspects illegally and tried to launder the data.
Having graduated from a police academy, I was greatly surprised to learn that most criminals (at least in the US) are caught because they do incredibly stupid things that make it almost trivial to catch them.
In the original Dunning-Kruger paper, one bad guy reasoned that since rubbing lemon juice on his face made his eyes blurry, it would also make him blurry to cameras.
I find it amusing to watch sovereign citizen videos. One of their failures is that they think that "law" is magic. All they have to do is utter the correct recipe of magic spells/words/red ink/stamps and they will be able to force the legal system to bend to their wishes.
> they were controlled by a police officer
I'm reminded of the COINTELPRO program run by the FBI in the 1960s. On more than one occasion, every participant in the "terrorist cell" (a modern term; the common one back then was "subversive group") was an FBI informant attempting to implicate the other members of the cell.
There's a reason J. Edgar Hoover held power for 48 years.
Kennedy wanted to "break the CIA into a thousand pieces"[1] and had a trusted brother as Attorney General to help with the task. And we learn 70 years later that Oswald was a CIA asset[2]. It's enough for even a President to sit up and take notice.
Rumors of their exit from dGPU predate Battlemage, so I wouldn't put a ton of credence in them. But Intel is quite talented at snatching defeat from the jaws of victory.
He just told you: because overwhelming public evidence supports the claim, especially the pricing of open-weight model inference. Why do you allow a prejudice to overshadow evidence?