October 7, 2008
Last Revised: May 23, 2012 4:50 PM
The word for this Policy is "utile," and you will look long and hard for a definition unless you have some clue.
I've carried that clue for decades and came to look for the word -- no find -- then added my "words of clue" and found the word immediately, penned, perhaps first, by the famous futurist author Michael Anissimov.
Michael Anissimov is a Singularity analyst and advocate. He is interested in futurist issues, especially the interrelationships between accelerating change, nanotechnology, transhumanism, and the creation of smarter-than-human intelligence. Michael has spoken on Singularity-related topics to audiences in San Francisco, Las Vegas, Los Angeles, and at Yale University. (source)
Question 1: Tell us about yourself. What is your background, and how did you first learn about the concept of the singularity?
I'm Michael Anissimov, 21 years old as of 2005. I may be relatively young for a futurist, but I grew up reading non-fiction works in areas relevant to the topics I talk about today - nanotechnology, artificial intelligence, bioethics, technological and social change. I've always been interested in science and the future, but what really set me on my present course was when I read Nano by Ed Regis in 1996. The book introduced me to the topic of nanotechnology and the idea of highly accelerated technological change.
Here is an excerpt from an article by him:
Michael Anissimov :: May 2004
Step 1: Seeking Peak Experiences
Ever have a moment in your life that made you feel like jumping for joy, or crying in happiness? Many claim that these are the moments that make life worth living, or at least a lot of what life is about. It's that moment where you finish writing a book, get a big promotion, or share an intimate moment with someone special. How many "typical" days would you give for a single moment like that? Some might say 1, others 10, others even 100 or more. Think about it - in a usual day, we're conscious for around 14 hours. Let's be conservative and suggest that the average John Doe would trade 5 typical days in exchange for a peak experience that lasts 5 minutes. The time ratio is about 1000:1, but many would still prefer the peak experience over the same old stuff. Unique experiences are really valuable to us.
This would imply that most people value life not only for the length of time they experience, but for the special moments that, as I mentioned earlier, "make life worth living". As the stereotypical quote goes, "Life is not measured by the number of breaths we take, but by the moments that take our breath away." Ethicists sometimes quantify such satisfaction as "utility" for the sake of thought experiments; we might say that each 5 minute peak experience is worth a thousand utility points, or "utiles". Correspondingly, each 5 days of typical activity would also count as roughly a thousand utiles, because one would trade one for the other. Although it may make some of us uncomfortable to quantify utility, our brain is unconsciously performing computations assessing the potential utility of choices all the time, and the model is incredibly useful in the psychology of human decision making and the field of ethics. Please bear with me as I make some assumptions about utility values and probabilities. Note that I acknowledge that two different people will not tag everything with the same utility, nor will they necessarily compute utility mathematically.
Following is an example of a typical human's lifelong utility trajectory. It plots utile-moments (u) against time (t). Let's say that the maximum u value reached is around 200 utiles/minute, or 1000 utiles for the 5 minute peak experience described above. For the sake of simplicity, let's assume about a hundred 5-minute peak experiences per typical lifetime. Since peak experiences are so fun to have, much of the activity on "typical" days probably entails setting the groundwork for these experiences to happen; ensuring that one does not starve and so on.
The curve rises as the agent has a series of interesting new experiences, plateaus throughout most of adulthood, and subtly falls off in later years, until death is finally reached. It punctuates through peaks and valleys. Some might strongly associate utility with wealth or frequency of sexual activity, others might see utility in their intellectual pursuits.
Total utility: around 6,000,000 utiles, if we figure a lifespan of 80.
The above article continues, very interestingly, and even takes up "utiles" in other contexts. I am going to use that word as part of an example of risk and how to measure a person's willingness to take risks.
Say that there is a game of chance. This game is absolutely controlled by the laws of probability.
First you might want to find out if the game is honest, or if there are people who will cheat and who might not pay you if YOU win:
A simple analogy will help illustrate this point. Imagine that you are playing poker with 10 people and that you learn that a minority of them is broke and would not pay you if they lose. You don't know, however, who the ones are who won't pay. In this environment, the risk of losing would be too high even if you know that most of the players are perfectly sound financially and would pay up if they lose.
In this environment, any rational card player would stop making bets until the true solvency position of each player is revealed and the bankrupt ones are expelled from the game. Having insolvent players sitting at the table spoils the game. Source: WSJ Oct 9, 2008
Let's assume, however, that there is a completely honest game. If you win, you will get paid.
The game can be set so that the probability of winning a coin toss is (a) 50%, (b) 40%, (c) 60%, or, really, any percentage.
Let's look at a game where the odds of winning on your guess about the result of a coin toss are exactly 70%.
In other words, when you bet on 100 coin tosses in this game, you can expect, quite accurately, to win about 70 times.
You might win 73 in one game of 100 tosses, or you might win 69 in another, but it would be virtually impossible to win fewer than 50 times in this game.
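The claim that winning fewer than 50 of 100 tosses is "virtually impossible" can be checked with an exact binomial tail sum. Here is a short sketch (my own illustration, not from the article):

```python
from math import comb

def prob_fewer_wins(n_tosses, p_win, threshold):
    """Exact probability that a Binomial(n_tosses, p_win) count of
    wins comes out below `threshold`."""
    return sum(
        comb(n_tosses, k) * p_win**k * (1 - p_win)**(n_tosses - k)
        for k in range(threshold)
    )

# With a 70% chance per toss, fewer than 50 wins in 100 tosses is
# astronomically unlikely (well under 1 in 10,000).
print(prob_fewer_wins(100, 0.70, 50))
```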
Next, say it costs $1 to enter this game and get 100 coin tosses. For every "win" you get TWO pennies.
When you "lose" the toss, you get zero.
The odds are that when you bet $1.00 this way, for 100 coin tosses, you are very likely to win about $1.40, plus or minus a very tiny difference.
The odds would not change were you to bet $100, instead of $1.00.
If you bet $100 for 100 coin tosses, you are very likely to win $140.00, or a little more or a little less.
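The arithmetic above is simply (win probability) x (number of tosses) x (payout per win). A minimal sketch:

```python
def expected_winnings(p_win, n_tosses, payout_per_win):
    """Expected total payout: each win pays `payout_per_win`, each loss pays zero."""
    return p_win * n_tosses * payout_per_win

# The $1.00 game: 100 tosses at 70%, two cents per win.
print(expected_winnings(0.70, 100, 0.02))   # about $1.40
# The $100 game: same odds, $2.00 per win.
print(expected_winnings(0.70, 100, 2.00))   # about $140
# A $20.00-per-win version of the same game.
print(expected_winnings(0.70, 100, 20.00))  # about $1,400
```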
You could say that for this game, set at this expected chance of winning, the $1.00 bet has an "economic utility" (in "utiles") of $1.40.
But the point is that a utile value of $1.40 might be valid for you in THIS game only . . .
Likewise, when you bet $100, the expected winnings would be $140.00 AND (perhaps) you could also say that the expected winnings would have "140 utiles" (a measure of "economic utility" in some sense).
"Utiles," however, are a very personal matter. One person may say he sees a "utile value" of $150 in this game where the odds are that he will win $140 beause he likes to gamble.
The next fellow doesn't like gambling at all and says that, for him, the winnings in this game have a utile value of $90.00!
Now, how do you feel about getting into the same game, except that you have to bet $1000 to enter and you win $20.00 for every win? The odds are still that you will win 70% of the coin tosses, and you get zero for any loss. Mathematically, you are virtually certain to win close to $1,400 over your 100 tosses (winning about 70 times out of 100).
What is the "utile value" of those winnings?
The question might be rephrased as: "What would YOU say is the 'economic utility' of being in that game if you had to pay $1000 to enter?"
Keep in mind that the expected winnings in this $1000 game are still $1,400. The odds do not change because the cost of entering the game changes.
Many people begin to see that they would not be willing to bet $1000 to get into THIS game, because the winnings would have to be higher than $1,400 for them to risk $1000 of their (scarce) money.
Another way of stating this is that the likely winnings from this game are $1,400, but the "utiles" from entering it are worth LESS than the $1000 it costs to play.
Sometimes money is so scarce that while you would bet a penny to be in some game with good odds, you wouldn't bet $100 even if the odds of winning were 90% and every win paid $2.00. In other words, your likely winnings from the $100 bet are $180.00, but the "economic utility" of getting into that game is, for you, only about $10. You might be willing to bet $10 to get into this game, because there are no odds large enough for you to be willing to RISK $100.
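One classical way to formalize this intuition (a Bernoulli-style sketch of my own, not the author's model) is to give money a concave utility, such as the logarithm of total wealth. The "certainty equivalent" of a gamble, the sure gain you would accept in its place, then falls below the expected winnings, and it collapses as the stake approaches your whole bankroll:

```python
import math

def certainty_equivalent_gain(wealth, stake, p_win, prize):
    """Sure gain judged as good as this gamble under log utility:
    pay `stake`; with probability p_win receive `prize`, else nothing.
    Requires wealth > stake (you cannot bet money you do not have)."""
    exp_utility = (p_win * math.log(wealth - stake + prize)
                   + (1 - p_win) * math.log(wealth - stake))
    return math.exp(exp_utility) - wealth

# A 90% chance at a $180 prize for a $100 stake: the expected gain is
# 0.9 * 180 - 100 = $62 regardless of wealth, but its subjective value
# shrinks as the $100 stake becomes most of what you have.
for wealth in (10_000, 1_000, 200, 101):
    print(wealth, round(certainty_equivalent_gain(wealth, 100, 0.9, 180), 2))
```

For someone holding only $101, the gamble's certainty equivalent under this assumed utility works out to only a few dollars, roughly the "$10" figure above, even though the expected gain is $62 either way.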
This is how people think, even if they don't reduce the game to this type of economic analysis.
Put in another context, say that you are suffering from cancer.
One treatment is offered that gives you a 90% chance of complete cure, or death in one month.
The other treatment offered gives you a 90% chance of living five more years, zero chance of death, and zero chance of cure. What do you do now?
Most people do not face such choices often, and when they do they do not try to come up with a rational logic for picking an answer.
I recently faced some unpleasant choices:
1. Pay $40,000 with a guarantee of making the problem go away. The problem would have been bankruptcy.
The other choice was:
2. Pay $1,000 on a claim that I could not understand, but where the person "said" that others had used it successfully and that HIS chances of success were 100% for me.
Add to this scene that I could not do both choices -- they would conflict.
The risk here, as in many life situations, is that the chance of success is unknown but CLAIMED to be 100%, and there is no way I can study enough to decide whether I AGREE that the chances of success are 100%.
Quotes from L. Ron Hubbard are copyright 1994 © by the L. Ron Hubbard Library. All rights reserved.