Smart people who get things wrong

Main article: User education and self-improvement

Dan asked about how the ratings system can handle smart people who are wrong. Smart people in the ratings system will have outsized influence. After all, we are hoping the ratings system will correctly identify those with talent and give them higher weight to influence society. So when they get things wrong, it matters more than when anyone else does.

One short answer we’ve looked at in the past: the ratings system gives people pause to examine their assumptions. Opinions are written down, and we have advocated written rather than spoken debate. It is also why we should pursue tempering mechanisms like those proposed for social media, particularly delays in publishing posted material so that folks can retract something they said too hastily. And unlike social media, the ratings system is not cost-free. If you’re wrong, the system should call you out.
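As a rough sketch of what such a tempering mechanism could look like (a minimal illustration; the class, delay length, and function names are assumptions, not part of any existing implementation), a post could sit in a cooling-off queue where its author can retract it before it ever becomes visible:

```python
import time
from dataclasses import dataclass, field

COOLING_OFF_SECONDS = 3600  # assumed one-hour delay before a post goes live


@dataclass
class PendingPost:
    """A submitted post that is held back for a cooling-off period."""
    author: str
    text: str
    submitted_at: float = field(default_factory=time.time)
    retracted: bool = False

    def is_publishable(self, now: float | None = None) -> bool:
        # Visible only after the delay has elapsed, and only if not retracted.
        now = time.time() if now is None else now
        return not self.retracted and (now - self.submitted_at) >= COOLING_OFF_SECONDS


def publish_ready(queue: list[PendingPost]) -> list[PendingPost]:
    """Return the posts whose cooling-off window has passed."""
    return [p for p in queue if p.is_publishable()]


# Example: an author thinks better of a hasty post and retracts it in time.
queue = [PendingPost("dan", "A claim I may regret...")]
queue[0].retracted = True      # retraction within the window
print(publish_ready(queue))    # -> []  (the post never becomes visible)
```

The point is not the particular delay but that retraction is cheap before publication and costly after it.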

Incidentally, one great thing about the ratings system is that it can identify smart people without being deceived by credentials. A lot of smart people exist in places you wouldn’t expect and without the credentials (e.g. degrees) you would expect them to have. Conversely, a lot of not-so-smart people exist in academia and hold fancy degrees.

Why is this? In short, it is because institutions don’t primarily reward intelligence or even competence. They reward the ability to get along with others (to put it mildly) and the ability to politic their way among and toward positions of influence (to put it bluntly). This happens in academia along with all other institutions. Institutions have the particular weakness of needing to survive and grow, like living organisms. Therefore, they favor supporters, whatever their objective merits. An eloquent but otherwise mediocre supporter of an institution is often far more noticed and “valuable” than the silent but objectively expert contributor.

The ratings system, to be sure, with its more diverse communities, will also be subject to these forces, but it will be less beholden to their ability to contort institutional and individual behavior. The ratings system will be able to take a more dispassionate view of people’s abilities and their intelligence.

One feature we should emphasize again about the software is introspection. We want people to constantly re-examine their beliefs and assumptions. This is why feedback is important. The system should remind us what our position is on a subject, how it is changing, and how it fares against the opinion of the rest of the community. If we are an outlier, we should know it and do the appropriate amount of thinking either to confirm our beliefs or to do the work of rejecting them. The ratings system will depend on everyone taking a scientific approach to their own beliefs and modifying them when presented with new evidence. Our education system should be doing this as well.
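As a minimal sketch of that feedback loop, assuming positions are encoded on a numeric scale (the scale, threshold, and function name below are illustrative assumptions rather than a specification), the system could compare a user’s stated position with the community distribution and flag outliers:

```python
import statistics

OUTLIER_Z = 2.0  # assumed threshold: about two standard deviations from the community mean


def position_feedback(my_position: float, community_positions: list[float]) -> str:
    """Remind a user where they stand relative to the rest of the community."""
    mean = statistics.fmean(community_positions)
    spread = statistics.pstdev(community_positions) or 1e-9  # avoid division by zero
    z = (my_position - mean) / spread
    if abs(z) >= OUTLIER_Z:
        return (f"You are an outlier (z = {z:+.1f}). Re-examine the evidence and "
                f"either confirm your position or revise it.")
    return f"Your position is within the community range (z = {z:+.1f})."


# Example: a user far from the consensus on some question gets prompted to re-check.
print(position_feedback(0.9, [-0.2, 0.0, 0.1, -0.1, 0.2, 0.0]))
```

The same prompt could also report how the user’s position has drifted over time, which is the “how it is changing” part of the feedback described above.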

This type of modification should be done for all beliefs of course, not just scientific ones. Political, philosophical, religious and deeply held moral beliefs should also come under scrutiny. Optimization of society starts with optimization of people.

Let’s discuss why smart people get things wrong. Well, for one, they’re fallible and make the same mistakes everyone does: they are given the wrong information, they carry built-in biases, or they have the right information but come to the wrong conclusion (bad logic).

Just like anyone, smart people can be propagandized. I have a friend who graduated, got a job, and moved away back in the ’90s. With less social interaction than he had in school, he started watching a lot of reality TV and, in particular, biased political talk shows. His views changed, and many of my arguments with him turned on misinformation he had absorbed or on his perception that some issue or another was highly important when it actually had very little impact on everyday life.

This is one thing propaganda does. Even if it doesn’t misinform, it creates the illusion that some issues, particularly minor cultural ones, are much more important than substantive issues. Again, the ratings system can help correct that by providing the community’s judgement on what is important and what isn’t. In any event, it is clear that propaganda works on smart people too.

Smart people are subject to long-standing societal biases. Relatively recently, a new macroeconomic idea has come to the fore, modern monetary theory (MMT), which holds that government spending (deficit spending, to be clear) is beneficial until it leads to inflation. Since the government can print its own money, it can always pay off any debts it takes on. It is a way to ensure that monetary policy is always geared toward maximum economic output, until the inflation limit is reached. But a typical response to MMT is that deficit spending is inflationary. It’s somewhat of a kneejerk reaction. Even among smart people, and it’s pretty much only smart people who will discuss MMT, long-standing ideas are hard to dislodge.

Smart people are subject to the "technical trap". They are often technical and specialize in scientific, engineering, or mathematical disciplines. Making progress in a technical subject is rewarding and can lead to cognitive errors, such as extrapolations that are not warranted. A scientist who discovers a problem with global warming data may conclude that global warming itself isn’t real. An analyst might discover some stock market price pattern and conclude that he has uncovered a new way to make money when, in fact, the price pattern is really just a subset of randomness.
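To make the stock-pattern example concrete, here is a small simulation (purely illustrative, not drawn from the article): scan enough random price series for a “winning” rule and some will look impressive in the data you searched, yet the same rule has no edge on fresh random data.

```python
import random

random.seed(0)


def random_walk(n: int = 250) -> list[float]:
    """A price-like series with no structure at all: a cumulative coin flip."""
    price, path = 100.0, []
    for _ in range(n):
        price += random.choice([-1.0, 1.0])
        path.append(price)
    return path


def rule_return(path: list[float]) -> float:
    """A toy 'pattern': buy after two consecutive up days, sell the next day."""
    total = 0.0
    for i in range(2, len(path) - 1):
        if path[i] > path[i - 1] > path[i - 2]:
            total += path[i + 1] - path[i]
    return total


# Search 500 random series and keep the best in-sample result for the rule...
best_in_sample = max(rule_return(random_walk()) for _ in range(500))

# ...then apply the same rule to 500 fresh random series.
out_of_sample = sum(rule_return(random_walk()) for _ in range(500)) / 500

print(f"best in-sample result : {best_in_sample:+.1f}")
print(f"average on fresh data : {out_of_sample:+.1f}   (no real edge)")
```

The “discovered” pattern is just the best-looking corner of randomness, which is exactly the unwarranted extrapolation the paragraph describes.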

Another problem smart people face is the funding trap. You believe what funds you. And smart people often require funding since, although they’re smart, they’re not usually successful businesspeople. It is difficult to argue with the ideology that pays you. Folks who work in “think tanks” funded by industry groups (e.g. banking, oil) are essentially paid to promote an ideology using science, which is the reverse of how science is supposed to work.

And folks working for particular industries, even if they are not paid to promote them per se, often have difficulty realizing the larger implications of their technical work. Few employees of chemical companies think about the pollution or environmental harm their work causes. They blithely accept the company’s platitudes that what it does is safe, compliant with the law, and so on. Few social media tech employees (Facebook, Twitter) grasp the negative implications of these industries. They do what most employees everywhere do: they keep their heads down and go to work every day. And usually, as a matter of cognitive comfort, they reconcile their beliefs with their employers’ interests.

Smart people have a tendency to believe “given” information since their job is often to draw conclusions from it or to process it in some way. They tend not to question the givens, a habit reinforced since their school days of solving homework problems that start with “given” information. In industries where products are designed, this problem manifests itself constantly in mistakes made by not reassessing “given” technical information. In many cases that information comes from customers who themselves make mistakes or provide the information without the required context.

Smart people tend to get lost in the weeds. Once they make progress, they become invested in their work, which obscures larger issues. Here is a classic example. A compressor company was developing a new type of machine to compress liquefied natural gas (LNG). The company had a large family of basic designs already worked out for other applications. It usually just grabbed one of those and did some detailed design of the internals to make it work for a customer’s specific application. They did the same here, except this was unlike any usual customer application: they started with a basic off-the-shelf selection and proceeded to a detailed 3-D design, which failed. It was like designing the internals of a new home after choosing an overall area too small to fit the number of people who would live there. The correct approach was to do a comprehensive analysis of alternatives at a lower fidelity of design (1-D), choose the right basic configuration, and proceed to the detailed design from there.

Smart people tend to magnify the importance of their own work and thoughts. They think their stuff is original or unique when it really isn’t. Because they’re smart, they tend to have insights into fields they aren’t that familiar with. Sometimes the insights are genuinely new. Many times they’re just well-understood concepts restated. Some economists, incidentally, argue that MMT is a good example of this, since it is seen as a reiteration of standard Keynesian economics.

Smart people can be suspicious of things not created by them and very protective of the things they do create. Put yourself in the shoes of a manager of an internal engineering team that obviously can’t do the project it’s been assigned. So you decide to contract it out and give your guys work more familiar to them. Seems like a win for everyone. But the internal team that just lost its project to the contractor might not think so, never mind that it had no hope of succeeding. Smart people are subject to the same emotions everyone else is, and shame, in any form, is one of the most powerful.

Smart people really, really hate being wrong. Just telling them they’re wrong (and why) is usually counterproductive. Ever seen a smart guy who’s wrong double down on his idea? It’s not pretty. They argue, they dissemble, they prevaricate, they try to weasel out of their position on some technicality, and so on. It’s best avoided. Better to lay a foundation where the smart person realizes they’re wrong and chooses to make an adjustment.

The ratings system offers a way to help with all these “smart people pitfalls”. It can reduce the tempo of debate so it is more considered and nuanced. Written debate, as we’ve stated in the past, is better than verbal debate (which encourages mistakes and plenty of bad behavior). Furthermore, the ratings system can rate people objectively, without getting personalities involved. Folks will know where they stand, what their strengths and weaknesses are, and, when decisions are made with that in mind, why those decisions were made. Giving the internal project to a contractor will make sense in this light and avoid the polite but obvious lying that takes place when executing a move like this.

For those who tend to think they’ve thought up something new, the community stands at the ready to gently correct them. In fact, it should become second nature for those in the business of creating ideas to run them across the ratings system before investing a lot of time in them. Things like expensive patent searches, done after the fact, can be avoided that way. For those lost in the weeds, the ratings system may not have a direct answer, but it should help people stay focused on their real contribution to society. In fact, the ratings system, by its nature, has a tendency to optimize human endeavor toward socially productive ends.

And, as an information-checking system, it can help those struggling with basic assumptions (givens) and those who extrapolate the importance of what they’ve done. Perhaps the most important thing it can do, however, is expose bias and debunk propaganda.