Sunday, March 24, 2013
When Mort Zuckerman, the New York City real-estate and media mogul, lavished $200 million on Columbia University in December to endow the Mortimer B. Zuckerman Mind Brain Behavior Institute, he did so with fanfare suitable to the occasion: the press conference was attended by two Nobel laureates, the president of the university, the mayor, and journalists from some of New York’s major media outlets. Many of the 12 other individual charitable gifts that topped $100 million in the U.S. last year were showered with similar attention: $150 million from Carl Icahn to the Mount Sinai School of Medicine, $125 million from Phil Knight to the Oregon Health & Science University, and $300 million from Paul Allen to the Allen Institute for Brain Science in Seattle, among them. If you scanned the press releases, or drove past the many university buildings, symphony halls, institutes, and stadiums named for their benefactors, or for that matter read the histories of grand giving by the Rockefellers, Carnegies, Stanfords, and Dukes, you would be forgiven for thinking that the story of charity in this country is a story of epic generosity on the part of the American rich.
It is not. One of the most surprising, and perhaps confounding, facts of charity in America is that the people who can least afford to give are the ones who donate the greatest percentage of their income. In 2011, the wealthiest Americans—those with earnings in the top 20 percent—contributed on average 1.3 percent of their income to charity. By comparison, Americans at the base of the income pyramid—those in the bottom 20 percent—donated 3.2 percent of their income. The relative generosity of lower-income Americans is accentuated by the fact that, unlike middle-class and wealthy donors, most of them cannot take advantage of the charitable tax deduction, because they do not itemize deductions on their income-tax returns.
But why? Lower-income Americans are presumably no more intrinsically generous than anyone else. However, some experts have speculated that the wealthy may be less generous—that the personal drive to accumulate wealth may be inconsistent with the idea of communal support. Last year, Paul Piff, a psychologist at UC Berkeley, published research that correlated wealth with an increase in unethical behavior: “While having money doesn’t necessarily make anybody anything,” Piff later told New York magazine, “the rich are way more likely to prioritize their own self-interests above the interests of other people.” They are, he continued, “more likely to exhibit characteristics that we would stereotypically associate with, say, assholes.” Colorful statements aside, Piff’s research on the giving habits of different social classes—while not directly refuting the asshole theory—suggests that other, more complex factors are at work. In a series of controlled experiments, lower-income people and people who identified themselves as being on a relatively low social rung were consistently more generous with limited goods than upper-class participants were. Notably, though, when both groups were exposed to a sympathy-eliciting video on child poverty, the compassion of the wealthier group began to rise, and the groups’ willingness to help others became almost identical.
If Piff’s research suggests that exposure to need drives generous behavior, could it be that the isolation of wealthy Americans from those in need is a cause of their relative stinginess? Patrick Rooney, the associate dean at the Indiana University School of Philanthropy, told me that greater exposure to and identification with the challenges of meeting basic needs may create “higher empathy” among lower-income donors. His view is supported by a recent study by The Chronicle of Philanthropy, in which researchers analyzed giving habits across all American ZIP codes. Consistent with previous studies, they found that less affluent ZIP codes gave relatively more. Around Washington, D.C., for instance, middle- and lower-income neighborhoods, such as Suitland and Capitol Heights in Prince George’s County, Maryland, gave proportionally more than the tony neighborhoods of Bethesda, Maryland, and McLean, Virginia. But the researchers also found something else: differences in behavior among wealthy households, depending on the type of neighborhood they lived in. Wealthy people who lived in homogeneously affluent areas—areas where more than 40 percent of households earned at least $200,000 a year—were less generous than comparably wealthy people who lived in more socioeconomically diverse surroundings. It seems that insulation from people in need may dampen the charitable impulse.
Wealth affects not only how much money is given but to whom it is given. The poor tend to give to religious organizations and social-service charities, while the wealthy prefer to support colleges and universities, arts organizations, and museums. Of the 50 largest individual gifts to public charities in 2012, 34 went to educational institutions, the vast majority of them colleges and universities, like Harvard, Columbia, and Berkeley, that cater to the nation’s and the world’s elite. Museums and arts organizations such as the Metropolitan Museum of Art received nine of these major gifts, with the remaining donations spread among medical facilities and fashionable charities like the Central Park Conservancy. Not a single one of them went to a social-service organization or to a charity that principally serves the poor and the dispossessed. More gifts in this group went to elite prep schools (one, to the Hackley School in Tarrytown, New York) than to any of our nation’s largest social-service organizations, including United Way, the Salvation Army, and Feeding America (which got, among them, zero).
Underlying our charity system—and our tax code—is the premise that individuals will make better decisions regarding social investments than will our representative government. Other developed countries have a very different arrangement, with significantly higher individual tax rates and stronger social safety nets, and significantly lower charitable-contribution rates. We have always made a virtue of individual philanthropy, and Americans tend to see our large, independent charitable sector as crucial to our country’s public spirit. There is much to admire in our approach to charity, such as the social capital that is built by individual participation and volunteerism. But our charity system is also fundamentally regressive, and works in favor of the institutions of the elite. The pity is, most people still likely believe that, as Michael Bloomberg once said, “there’s a connection between being generous and being successful.” There is a connection, but probably not the one we have supposed.
Sunday, February 3, 2013
If you try to pet a grizzly bear or pick up a rattlesnake, you are not only likely to be attacked, you are also being very foolish. Animals are defensive of their homes and are much more likely to attack if they feel threatened.
Animals that didn’t make the list
Mosquitoes are widely regarded as the most deadly creature on the planet, killing an estimated 3 million people per year, but the mosquito is not the real killer. Malaria is caused by a parasite that mosquitoes carry. Micro-agents such as parasites, viruses, and bacteria kill millions of humans, but they are not included on this list. Humans, the most deadly animals on the planet, are also excluded.
Deer rarely kill people directly, but auto accidents caused by deer kill about 130 people per year. Since the deer does not directly kill the person, these deaths are counted as auto-related accidents.
Bee stings are the largest cause of animal-related deaths in the U.S. Allergic reactions to bee venom kill 53 people per year, and the number is increasing every year as the aggressive Africanized honey bee spreads through Texas.
Black widow and brown recluse spiders kill 6.5 people per year. The victims are usually young children who do not get medical attention right away.
Rattlesnake venom kills 5.5 people per year. Rattlesnake attacks are always defensive. Most rattlesnake-related deaths involve males between 17 and 27, and alcohol is usually a factor, which worsens the venom’s effects. I picture a drunk kid on a camping trip messing with the snake and then not seeking medical attention immediately.
Scorpions and centipedes are responsible for one death every two years on average, usually in remote areas where medical care is inadequate.
Sharks, alligators, and mountain lions are the only U.S. predators that hunt humans in the wild.
The most feared animal is, without a doubt, the shark. The Jaws craze has sent a wave of fear across America for the past quarter century. In reality, fewer than one person per year is killed by a shark in the U.S. Hawaii, California, and Florida are the most likely places to be attacked.
While Jaws is purely fictional, two true stories of shark attacks continue to haunt us. The 1916 New Jersey attacks killed four people over the course of two weeks. Most amazingly, the shark responsible, possibly a bull shark, traveled up a river and attacked people swimming in a creek 5 miles from the ocean. The other story is actually told in Jaws: the USS Indianapolis was sunk in WWII, and over the next four days the survivors were picked off one by one by oceanic whitetip sharks in the open ocean. Of the roughly 900 sailors who went into the water, all but 317 died.
Mountain lions are by far the most dangerous land predator in the U.S. While deaths are extremely rare (about one per year), the thought of being stalked, killed, and eaten is horrific. Alligators in Florida have killed 18 people in the last 60 years, and attacks have been increasing in recent years, a trend attributed to human encroachment into the alligator’s habitat. Many attacks occur on golf courses, which have been built over drained Everglades.
Bear attacks are almost always defensive. Alaska and Yellowstone National Park are the only places in the U.S. where fatal bear attacks usually occur. Grizzly bears are not interested in humans as food, except in late fall before hibernation. Bear attacks cause less than one fatality per year.
Pet dogs account for 31 deaths per year in the U.S. The pit bull is not a recognized breed of dog, and many mutts that resemble pit bulls kill people, so classification is difficult. Even so, the pit bull variety is by far the largest killer of humans, followed by Rottweilers and Huskies. Dozens of different breeds can kill people; Basset Hounds, Beagles, Dachshunds, Labradors, and even Golden Retrievers have killed humans.
Wolf deaths usually occur when people bring wolves home as pets. Three small children have been killed by pet wolves in the past 30 years. In the wild, there has not been a fatal wolf attack in the U.S. since 1888. (Two deaths have occurred in Canada in the past 10 years.)
A 12-foot pet Burmese python recently strangled a 2-year-old girl to death in Florida. While it is rare for a python to kill a human, it can happen, so I included it on this list.
Non-native animal attacks
On rare occasions, attacks occur at zoos or circuses. In 2007, a man was killed by a tiger at the San Francisco Zoo. There have been a few deaths in the U.S. caused by elephants, though the chance of dying from an elephant attack here is vanishingly small. Worldwide, however, elephants kill over 125 people per year, mostly in Africa and India.
This is a bit of a different category, because the animals usually do not intend to cause injury or death. Rodeo, equestrian, and bull-riding deaths are infrequent relative to how many people are exposed to these animals, but they do happen. An average of 20 people per year are killed in horse-related accidents, and 3 people are killed by bulls.
Average Number of Deaths per Year in the U.S.
[Via History List]
Friday, January 18, 2013
1. Cadavers for anatomical study, organ donation, bone and tissue
3. Surrogate mothers, egg and sperm donation, abortion, birth control
4. Prostitution, pornography
5. Polygamy, gay marriage, incest
6. Life insurance
8. Interest on loans
9. Payments for athletes
10. Using horse and dog meat
Sunday, January 6, 2013
A New York man put “himself” out there in his efforts to nab the man who stole his phone, posing as a woman online after he discovered the thief had started looking for dates under his name.
Nadav Nirenberg said the problem started when he left his phone in a cab on New Year’s Eve. Within 24 hours, he said, he found messages sent from his profile on the dating site OkCupid, which is linked to the phone.
So Nirenberg grabbed a picture of a woman and concocted his own false identity, contacting the man and convincing him to meet in the flesh.
“My best version of talking as a girl, as a flirty girl, I should say, is adding winky face emoticons,” Nirenberg said.
Nirenberg got the thief to come to his apartment in Brooklyn, he said, and confronted him on a nearby stairwell, bringing along both some money and some protection when the time came to reclaim his property.
“I put the $20 in his hand to defuse the situation as fast as possible,” Nirenberg said, recapping the encounter. “But I had a hammer in my hand just in case.”
The would-be suitor did at least seem to make an effort, though; Nirenberg pointed out that he came dressed nicely and brought a bottle of wine.
“As he was walking away, I was surprised,” Nirenberg said after getting his phone back. “I said, ‘You smell great though.’”
[Via The Raw Story]
Wednesday, January 2, 2013
According to the Congressional Budget Office, the last-minute fiscal cliff deal reached by congressional leaders and President Barack Obama cuts only $15 billion in spending while increasing tax revenues by $620 billion—a 41:1 ratio of tax increases to spending cuts.
When Presidents Ronald Reagan and George H.W. Bush increased taxes in return for spending cuts—cuts that never ultimately came—they did so at ratios of 1:3 and 1:2.
“In 1982, President Reagan was promised $3 in spending cuts for every $1 in tax hikes,” Americans for Tax Reform says of those two incidents. “The tax hikes went through, but the spending cuts did not materialize. President Reagan later said that signing onto this deal was the biggest mistake of his presidency.
“In 1990, President George H.W. Bush agreed to $2 in spending cuts for every $1 in tax hikes. The tax hikes went through, and we are still paying them today. Not a single penny of the promised spending cuts actually happened.”
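For what it's worth, the 41:1 figure follows directly from the CBO numbers quoted above. A quick back-of-the-envelope sketch in Python (the dollar figures are simply the ones cited in this post):

```python
# Figures cited above, in billions of dollars (from the CBO estimate).
tax_increases = 620  # new tax revenue in the fiscal cliff deal
spending_cuts = 15   # spending cuts in the same deal

# Ratio of tax increases to spending cuts
ratio = tax_increases / spending_cuts
print(f"{ratio:.0f}:1")  # roughly 41:1

# For comparison, the Reagan (1:3) and Bush (1:2) deals promised
# $3 and $2 of cuts, respectively, for every $1 of tax hikes.
```

So the fiscal cliff deal inverts the old ratios by two orders of magnitude: tax increases outweigh cuts about 41 to 1, rather than cuts outweighing taxes 3 or 2 to 1.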
Tuesday, January 1, 2013
The "fiscal cliff" deal that was designed to save money actually includes $330.3 billion in new spending over the next decade, according to the official estimate the Congressional Budget Office released Tuesday afternoon.
CBO said the bill contains about $25.1 billion in new cuts, but those are swamped by the new spending on extended unemployment benefits for the long-term jobless and other new refundable tax credits that President Obama fought for.
Of those cuts, only $2 billion are scheduled to take effect in 2013.
And CBO also warned that some of the cuts Congress is counting are from programs on which CBO never expected the money to be spent anyway — such as cuts to the Consumer Operated and Oriented Plan, which was part of Mr. Obama's health care law.
All told, the bill deepens the deficit by nearly $4 trillion over the next decade, when the new tax cuts and spending are combined.
The bill also delays by two months the automatic spending cuts slated to take effect Wednesday, with a promise to reduce spending in the future to cover for them.
[Via The Washington Times]