Science, it definitely still works bitches.
  • Petey
    You know the drill.
    The janitor.
  • Part of the drill involves summoning Pause from his Nth dimensional garret.
  • I'll bite. I can almost pass as a scientist.

    I bring to the table for discussion: LASERS. Specifically, BIG FUCKING LASERS. More precisely, MANY BIG FUCKING LASERS FOCUSED ON TINY LITTLE THINGS. People have been very excited about nuclear fusion for a good 60 years now and it still doesn't work, but everyone should be genuinely excited about Inertial Confinement Fusion. Check out NIF, yo.

  • Skerret
    get tae
    cosmia said:

    I'll bite. I can almost pass as a scientist.

    To what degree do you pass as one?  Undergrad, postgrad, mega-grad??

    Also lasers are cool I guess.
    Skerret's posting is ok to trip balls to and read just to experience the ambience but don't expect any content.
    "I'm jealous of sucking major dick!"~ Kernowgaz
  • Where's SpaceGazelle? He was good for the oul' bit o' knowledge now and then. 
  • Undergrad physicist, currently working in a high-power laser facility for a year, so still got a fair way to go. 

    It's less the lasers themselves that are cool (they're pretty standard as lasers go) and more that they're hopefully only a couple of months away from achieving ignition. As soon as ignition is achieved, the first prototype ICF power plant can be built. That's where things get exciting.
  • Good lord!

    Seriously, only months away?
  • They reckon September for ignition. It'd be years before a prototype plant got up and running, but yeah. One of the guys came over to the lab here and gave a talk on their progress. We got chills. The amount of stuff they have to think about to get it all stable is truly incredible.
  • Putting out more energy than it takes to power the lasers? That's incredible.
  • In principle, yes, but it won't be harnessed at NIF, as the infrastructure isn't in place (lithium blanket to steam engine, pretty much). No returns for a while!

    Pretty picture: [image]
    That's the target chamber. 200-odd beams input, focused down into two ends of a tiny gold tube. The pulse is in the nanosecond region and the capsule is dropped from the top of the chamber. Dropped. Blows my little mind.
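    For a sense of scale, here's a back-of-envelope sketch in Python using NIF's widely quoted design figures (roughly 1.8 MJ on target in a few nanoseconds, and a flashlamp-pumped laser somewhere around 1% wall-plug efficient); treat every number as an assumption for illustration, not a measurement from this thread:

```python
# Rough NIF-style numbers (widely quoted design figures; approximate).
laser_energy_j = 1.8e6      # ~1.8 MJ of laser light on target
pulse_duration_s = 3.6e-9   # main pulse lasts a few nanoseconds

# Cramming megajoules into nanoseconds is what makes the peak power huge.
peak_power_w = laser_energy_j / pulse_duration_s
print(f"Peak power: {peak_power_w / 1e12:.0f} TW")

# "Ignition" means fusion yield exceeding laser energy on target (target
# gain Q > 1). The gain relative to the electricity drawn from the wall
# is far lower, because the flashlamp-pumped laser itself is only about
# 1% efficient (assumed order of magnitude).
wall_plug_efficiency = 0.01
fusion_yield_j = 1.8e6      # a Q = 1 "ignition" scenario
electrical_input_j = laser_energy_j / wall_plug_efficiency
engineering_gain = fusion_yield_j / electrical_input_j
print(f"Engineering gain at Q=1: {engineering_gain:.2f}")
```

Which is why ignition at NIF would be a physics milestone rather than a power station: even at Q = 1, the whole facility is still consuming far more electricity than the fusion releases.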
  • That there Brian Cox fella did a bit about NIF on Horizon a couple of weeks ago.  

    Here it is: [video]

    (I saw a lecture from Brian Cox once.  I suspect it will be the only time in my life when I witness a lecture on particle physics that is frequently interrupted by the excited screaming of teenage girls.)
  • To use a technical term: Chuffing nora!
  • Skerret
    I saw a story on this very thing, last year I think.  Also, Petey's done well.
  • He gave a lecture here- I fell asleep. I haven't seen that, I'll give it a watch.
  • Skerret
    So he still has some work to do. I gave a lecture on Tuesday; not one sleeper, my record remained intact. I'll see if I can find the doco thing, pretty sure it was on this topic.
  • I've definitely seen two, one with the keyboard player from D:Ream and one not.
  • I have a New Scientist subscription and I've been meaning to post some interesting articles for a while now, so here goes...

    The Price Of Wealth
    Psychologists now have evidence that money breeds greed and kills empathy. Knowing how could help solve social ills
    Editorial: "Rich suffer as well as the poor in unequal society"
    The idea that money changes people for the worse is deeply ingrained in western culture. From A Christmas Carol, with its archetypal miser Scrooge, to Wall Street with its ruthless anti-hero Gordon Gekko, countless stories have featured individuals who forsake compassion as they amass their fortunes. More recently, the press has taken to vilifying bankers for awarding themselves huge bonuses while taking excessive risks with investments.
    But what is the truth behind the clichés? Do riches really breed selfishness and greed at the expense of empathy and compassion? If so, why? Although researchers have explored many of the ramifications of class and wealth since the birth of social science in the late 19th century, only recently have they started to look in detail at the way money shapes our ability to relate to other people. The results are surprising, offering a picture of the impact of wealth on our psychology that goes far beyond the usual stereotypes. Understand these effects, and you get a better handle on the other inequalities marking the vast gulf in health and well-being that separates the rich and poor. It might even help explain our diverse reactions to the current economic crisis.
    Dacher Keltner at the University of California, Berkeley, has pioneered much of the recent work. He first started to contemplate the link between wealth and empathy after being struck by what he calls "the profound self-interest and social disconnect" shown by Wall Street bankers, while at the same time recalling the generosity of his neighbours growing up in a poor area. Someone going through tough times, he reasoned, needs the help of others to see them through and so becomes more sensitive to the feelings of those around them. For example, if you have less income you may have to rely on friends and neighbours for childcare or travel, and as a result will develop more effective social skills. "If you don't have resources and education, you adapt to the environment - which is more threatening - by turning to other people," says Keltner. "You just have to lean on people." Those with more money, in contrast, can afford to pay less attention to others, which could explain why your well-paid boss is so unsympathetic.
    From these musings, Keltner and Michael Kraus at the University of California, San Francisco, designed a series of experiments to test whether people from different social backgrounds really do interact differently. In one of their earliest studies, they divided about 100 volunteers into pairs, and then filmed each pair meeting and getting acquainted for 5 minutes. To make sure that their own expectations couldn't sway their interpretation of the behaviour, Keltner and Kraus asked two independent observers to view the resulting videos and rate each participant's actions during the exchange, by counting how often they showed signs of interest such as nodding, laughter and eye contact, compared with more detached behaviours such as doodling.
    In line with Keltner's theory, the poorer subjects were more likely to use warmer and more expressive body language and gestures that signal engagement, while the richer participants were more stand-offish (Psychological Science, vol 20, p 99). "Those from the wealthiest families would go directly to their cellphones to check the time, or they would fiddle with their backpack to make sure it was in order," says Kraus.
    The team suspected that these different styles of interaction might have reflected the participants' ability to judge another person's feelings. To find out if wealth can influence empathy, the researchers first asked 200 university employees, with jobs ranging from administrative support to managerial positions, to rate the emotions expressed in 20 photographs of human faces - a standard test of emotional intelligence. As predicted, those with the more prestigious jobs were consistently worse at the task.
    In another experiment, the team divided a group of students into pairs and asked them to act out mock interviews - one student as the potential employer, one as the would-be employee. Afterwards, they were asked to rate their feelings, such as excitement, hope or worry, using a 10-point scale. They also had to estimate the scores of their partners. Once again, the students from poorer backgrounds were better at guessing their partner's feelings than those from wealthier backgrounds (Psychological Science, vol 21, p 1716).
    Importantly, Keltner and Kraus have found that these differences were fluid, changing with the participant's perception of their position within a group. When asked to imagine a conversation with someone they deemed to be higher up the social ladder, the wealthier participants became immediately better at reading emotions. The team concluded that the observed effects are probably automatic reactions that lead us to become more vigilant and mindful of others when we feel subordinate.
    Keen to investigate the way in which wealth might influence other behaviours, the team turned to an experiment designed to test altruism, in which each participant has to decide how to divide a reward with an anonymous partner who is supposedly sitting in another room. Despite being poorer, people from less-privileged backgrounds tended to give more than those higher on the social ladder. Similar results emerged from an online survey and game (Journal of Personality and Social Psychology, vol 99, p 771).
    This selfish tendency on the part of the better-off seems to translate to all kinds of situations, with laboratory and real-world experiments revealing many instances in which wealthier people are more likely to behave unethically than those from poorer backgrounds. For instance, Keltner's latest study has found that richer people are more likely to commit an offence while driving, eat sweets that are intended for children, or cheat to increase their chances of winning a prize (Proceedings of the National Academy of Sciences, vol 109, p 4086).
    Taken together, the results provide some preliminary support for Keltner's theory. However, it may be best to reserve judgement until someone tests the apparent behavioural differences in more true-to-life settings, says Linda Gallo at San Diego State University, California. She points out that many of the experiments have been conducted in university labs - and people might not be as empathic as Keltner's studies suggest if tested "in situ" in tougher, deprived areas. It is also possible that the choice of participants, who were mostly students, doesn't reflect the rest of the population. If so, they wouldn't be the first experiments to have been skewed by a relatively narrow sample; psychologists are becoming increasingly concerned about studies that rely on educated subjects in western, industrialised countries to draw conclusions about humanity (New Scientist, 13 November 2010, p 42).
    Yet Hazel Rose Markus at Stanford University in California, who studies the effects of culture on behaviour, has also found that social and financial success can make people less caring. She suggests that the differences may arise from the sheer range of opportunities afforded by wealth - the rich spend more time considering how to spend their fortune than worrying about the needs of others, she thinks (Social Psychological and Personality Science, vol 2, p 33). "The conditions of life of those in the professional middle class focus their attention on themselves and their own needs, interests and choices, which makes them less caring," she says.
    Markus suspects that psychological differences may help to explain some of the other inequalities between the rich and poor. Consider a few districts in London; the average male life expectancy is 88 years in one particularly well-heeled district in the borough of Kensington and Chelsea, as compared with 71 in one of the poorest areas, Tottenham Green in the north of the city.
    Part of the explanation for this is straightforward: money can buy a better diet, a gym membership and better healthcare. Furthermore, wealthier people are more likely to have a better education, which leads to less physically stressful and more rewarding jobs.
    Poorer people have less of all of this, and they also have the stress of knowing they are low down in the pecking order. There is now much evidence that this tension compounds the impact of a less comfortable lifestyle. For instance, the extensive Whitehall studies, which examined the health of British civil servants over decades, found a very clear link between illness and job grade. Those lower down the hierarchy were more likely to have cardiovascular and respiratory disease and to die younger than those in more senior positions, even after other social and economic factors had been taken into account.
    Richard Wilkinson, who studies the social determinants of health at the University of Nottingham, UK, has described the distress caused by social inequality as equivalent "to more rapid ageing", because it "compromises the immune and cardiovascular systems and increases our vulnerability to so many diseases".
    Emotional double hit
    Where do the recent findings come into this? Social interaction is meant to be vital to mental and physical well-being, so you might expect the closer social ties in poorer communities to mitigate these stresses. Yet the increased empathy may in fact amplify the burden, by making people more acutely aware of their lowly position on the economic ladder. "My hunch is that the increased empathy of the working class does not buffer them from stress but rather adds to the stress," says Markus.
    Her intuition finds some support in one of Kraus's most recent series of experiments. He placed pairs of participants in slightly tense social situations, in which they were encouraged to create amusing nicknames for one another. Rating their emotions before and after the exchange, those from poorer backgrounds tended to show a greater dip in their mood - suggesting they are more sensitive to perceived social slights (Personality and Social Psychology Bulletin, vol 37, p 1376). "This is one of the negative consequences of being empathic in a context that is profoundly unfair," says Kraus.
    It adds up to a double whammy of disadvantage: not only do the worse off face poorer resources and opportunities; they are also more attuned to the injustice of their situation, which may contribute to higher levels of anxiety, hopelessness and depression - and, as a result, ill health. Gallo agrees that "a low sense of control and self-esteem and high levels of negative emotions such as depression and hostility help to explain why individuals with low socio-economic status have worse physical health".
    The more self-centred mindset that comes with riches might also have a profound effect on someone's political opinions. When the team asked university students to explain increasing economic inequality in American society, those from poorer backgrounds thought it due to political influence or disparities in educational opportunities. Those from wealthier backgrounds put it down to hard work or talent (Journal of Personality and Social Psychology, vol 97, p 992). In other words, poorer people, who must rely much more on others to get by, are more aware of contextual or social factors that might contribute to someone's circumstances, while those with the social and financial resources to go it alone consider that life is what you make it.
    On one level, this seems predictable: wealthy people want to feel they deserve their high income, and no one who is hard up wants to hold themselves responsible. But such perceptions may have important consequences when it comes to politics. Although the links between wealth, personality and political opinion are difficult to disentangle, it's plausible that the reduced empathy that comes with wealth and success may contribute to a more conservative, right-wing position aimed at preserving the interests of the rich.
    Dishearteningly, Keltner's research might also suggest that the money and prestige of high office could degrade the altruistic tendencies of even the most well-meaning politicians. "A government run by wealthy, educated people is going to be interested in maintaining the current social order," says Kraus. "[Its members] will not be interested in the welfare of everybody, but in the welfare of themselves and their own goals."
    More generally, the work could be seen to undermine "trickle-down economics": the notion that money made or inherited by rich people will end up benefiting poorer individuals, through the creation of new businesses that provide jobs for middle or low-income earners, for example. This argument is often made in support of tax cuts for the wealthy. Yet if the rich do create more jobs as a result, Keltner's findings suggest they will be more concerned with preserving their own interests, by awarding themselves hefty bonuses, for instance, rather than creating a constructive working environment with fair wages for all. "Our results say you cannot rely on the wealthy to give back, to fix all the problems in society," Keltner says. "It is improbable, psychologically."
    Fortunately, not everyone seems to be corrupted by the trappings of success - as many instances of generous philanthropy attest (New Scientist, 24 September 2011, p 36). And although Kraus and Keltner's experiments may seem to offer a pessimistic view for those hoping to achieve greater social equality, they do at least suggest that the tendencies aren't set in stone, and that under the right circumstances, the well-off can be encouraged to become more empathic.
    "If you can make them aware of those things, you can shift their self-interest," says Kraus. Future research will no doubt offer some suggestions for the best approach - although it will probably take more than psychological trickery to open the eyes of many dyed-in-the-wool politicians.
    Michael Bond is a New Scientist consultant in London
    Live= sgt pantyfire    PSN= pantyfire
  • E. O. Wilson: from altruism to a new Enlightenment
    24 April 2012 by Liz Else
    Magazine issue 2861
    Groundbreaking sociobiologist E. O. Wilson argues that group selection is the main driver of evolution and explains why we need a new intellectual revolution
    You've recently been involved in a high-profile academic row over what drives the evolution of social traits such as altruism. Why should non-specialists care?
    That is one of the main points of my new book. Scientific advances are now good enough for us to address coherently questions of where we came from and what we are. But to do so, we need to answer two more fundamental questions. The first is why advanced social life exists in the first place and has occurred so rarely. The second is what are the driving forces that brought it into existence.
    Eusociality, where some individuals reduce their own reproductive potential to raise others' offspring, is what underpins the most advanced form of social organisation and the dominance of social insects and humans. One of the key ideas to explain this has been kin selection theory or inclusive fitness, which argues that individuals cooperate according to how they are related. I have had doubts about it for quite a while. Standard natural selection is simpler and superior. Humans originated by multilevel selection - individual selection interacting with group selection, or tribe competing against tribe. We need to understand a great deal more about that.
    How will a better understanding of multilevel selection help?
    We should consider ourselves as a product of these two interacting and often competing levels of evolutionary selection. Individual versus group selection results in a mix of altruism and selfishness, of virtue and sin, among the members of a society. If we look at it that way, then we have what appears to be a pretty straightforward answer as to why conflicted emotions are at the very foundation of human existence. I think that also explains why we never seem to be able to work things out satisfactorily, particularly internationally.
    So it comes down to a conflict between individual and group-selected traits?
    Yes. And you can see this especially in the difficulty of harmonising different religions. We ought to recognise that religious strife is not the consequence of differences among people. It's about conflicts between creation stories. We have bizarre creation myths and each is characterised by assuring believers that theirs is the correct story, and that therefore they are superior in every sense to people who belong to other religions. This feeds into our tribalistic tendencies to form groups, occupy territories and react fiercely to any intrusion or threat to ourselves, our tribe and our special creation story. Such intense instincts could arise in evolution only by group selection - tribe competing against tribe. For me, the peculiar qualities of faith are a logical outcome of this level of biological organisation.
    Can we do anything to counter our tribalistic instincts?
    I think we are ready to create a more human-centred belief system. I realise I sound like an advocate for science and technology, and maybe I am because we are now in a techno-scientific age. I see no way out of the problems that organised religion and tribalism create other than humans just becoming more honest and fully aware of themselves. Right now we're living in what Carl Sagan correctly termed a demon-haunted world. We have created a Star Wars civilisation but we have Palaeolithic emotions, medieval institutions and godlike technology. That's dangerous.
    Yet at the end of your new book, The Social Conquest of Earth, you seem upbeat, arguing that by the next century we could turn the world into a permanent paradise for humans.
    It's a statement of hope. I was not going to say we have intrinsically screwed up. I had to finish it as a full-blown American optimist saying I think we are going to find our way out of this and we are going to do it with education and science.
    But the clock is ticking...
    That's right. That is why I'm devoted to the kind of environmentalism that is particularly geared towards the conservation of the living world, the rest of life on Earth, the place we came from. We need to put a lot more attention into that as something that could unify people. Surely one moral precept we can agree on is to stop destroying our birthplace, the only home humanity will ever have.
    Do you believe science will help us in time?
    We can't predict what science is going to come up with, particularly on genuine frontiers like astrophysics. So much can change even within a single decade. A lot more is going to happen when the social sciences finally join the biological sciences: who knows what will come out of that in terms of describing and predicting human behaviour? But there are certain things that are almost common sense that we should not do.
    What sort of things shouldn't we do?
    Continue to put people into space with the idea that this is the destiny of humanity. It makes little sense to continue exploration by sending live astronauts to the moon, and much less to Mars and beyond. It will be far cheaper, and entail no risk to human life, to explore space with robots. It's a commonly stated idea that we can have other planets to live on once we have used this one up. That is nonsense. We can find what we need right here on this planet for almost infinite lengths of time, if we take good care of it.
    What is it important to do now?
    The title of my final chapter is "A New Enlightenment". I think we ought to have another go at the Enlightenment and use that as a common goal to explain and understand ourselves, to take that self-understanding which we so sorely lack as a foundation for what we do in the moral and political realm. This is a wonderful exercise. It is about education, science, evaluating the creative arts, learning to control the fires of organised religion and making a better go of it.
    Could you be more concrete about this new Enlightenment?
    I would like to see us improving education worldwide and putting a lot more emphasis - as some Asian and European countries have - on science and technology as part of basic education. To that end, the E. O. Wilson Biodiversity Foundation - a foundation I was invited to give my name to - has been working with Apple to create an online beginners' course in biology using the best animation techniques.
    We're using 3D animators, educators, multimedia artists trained in science and cinema, and textbook professionals. It's a sort of portal that runs from molecules to ecosystems, from the origin of life to the modern awareness that we control the environment we live in. It will present modern biology in an exciting, compelling way. The first chapters have already been distributed for free by Apple in 32 countries, including the US.
    Profile
    E. O. Wilson is professor emeritus at Harvard University. Among his 25 books are the groundbreaking Sociobiology (1975), Consilience (1998), the Pulitzer prizewinners On Human Nature (1978) and The Ants (1990), and the novel Anthill (2010). His latest book is The Social Conquest of Earth (W. W. Norton).
  • Network your stuff: Amateurs inventing a new internet
    25 April 2012 by MacGregor Campbell
    Magazine issue 2861

    Video: Watch a toy gun blow bubbles when an email is received

    Thousands are tweaking and wiring up their possessions and homes – these tinkerers may usher in the next stage of the internet
    IF AN email message were a physical object, what would it be? For Ted Hayes, a freelance designer based in New York, it's a soap bubble.
    He decided to connect his inbox to a toy soap-bubble gun using cheap and simple electronics. Hayes had little technical skill at the time, yet he got the contraption to work, spreading bubbles each time he received an email. He says he was pleased that "a normally frivolous and cheap plastic object could become interconnected with the world".
    Hayes represents a growing movement of tinkerers who are merging the online and physical worlds in surprising ways. Instead of waiting for technology companies like Cisco or Apple to make their gadgets, these "makers" are buying off-the-shelf computer chips, sensors and wireless radios, and doing it themselves. They are transforming their possessions - from plant pots and clothing to thermostats or cuddly toys - to become smarter, connected and social.
    At first glance, many of their creations seem amateurish: after all, they are often made with duct tape and cardboard rather than brushed aluminium and glass. But the bigger picture is that this subculture of makers is driving something far more important. If history is a guide, what these people are up to today will shape the next stage of the internet, and transform your relationships with your own possessions and home.
    For more than a decade, the internet has seemingly been on the cusp of a major expansion, in which physical objects become part of it. Today, you access the digital realm from your smartphone or computer, but in principle, adding electronics and antennas to many of your other possessions could make them part of the same network as websites and apps.
    Governments, academics and big technology companies have been promoting this vision of an "Internet of Things" for years. Want to know if you are out of milk at home? Use a smartphone app to ask your fridge. Yet despite the hype, the vision has yet to materialise.
    Object hackers
    Of course, niche successes do exist. Vitality, a company based in Cambridge, Massachusetts, makes a smart pill-bottle that can report your medication habits to your doctor over the web. Some high-end household appliances can also be controlled online. So far, however, no one has found a "killer app" that becomes a regular feature of our day-to-day lives.
    But where established companies are still struggling to figure out how to connect with consumers, a growing community of amateurs is busy creating thousands of smart devices. And some technology observers believe that all this activity is revealing how to build an Internet of Things that people actually want to use.
    Hobbyists have always tinkered with technology, but what is different about this maker movement is that every year, the technical expertise required to participate is dropping. Much of the current activity was spurred by the introduction of an open-source computer called Arduino. In 2005, a group of designers at the Interaction Design Institute Ivrea in Italy needed a simple bit of hardware to help design students who had little technical knowledge to create prototypes of interactive devices. They came up with a credit-card-sized circuit board featuring a single chip-based computer called a micro-controller. Crucially, the board also featured easily identifiable sockets for input and output signals, a small amount of on-board memory and a USB connection that made it easy to program via a regular computer.
    Students could attach sensors for anything from light levels to sound, to how far a material was bent from its original shape. Then, using a simplified programming language, they instructed their devices to control mechanical motors, LEDs or wireless internet connections.
    In the ensuing years, with more than 300,000 units out in the wild, Arduino has become a global DIY phenomenon, giving artists, designers and tinkerers everywhere an accessible way to add interactivity to just about anything.
    Hayes's email bubble gun, for example, is based on Arduino, as is a whole menagerie of other amateur-built devices. Some have even created entire home-automation systems that give them the ability to control lights, gates and doors from a web page.
    Take Darja Gartner, a graphic designer based in Geneva, Switzerland. In October 2010, Gartner and her boyfriend participated in a project called HomeSense, run by Tinker, a London-based design firm. Participants were given an Arduino TinkerKit - a set of sensors and attachments for Arduino that further simplifies the design process - and let loose to figure out how to use the kit to make their homes "smarter".
    The participants varied in technical ability, but were largely successful in making smart objects that had meaning and use for them. One person made a coaster that flashes LEDs to remind him to get up from his desk for a break from work; another outfitted a toy robot to remind people to flush after using the toilet.
    Gartner and her boyfriend got stuck in, creating several gadgets. One turned off the light in the hallway if it had been left on too long, and another watered the plants when the soil moisture got too low.
    Still, the pair also discovered a few limitations of Arduino. They hoped to build a sensor that would help them to stay on good terms with the people in their apartment building. "My neighbours were extremely sensitive about noise, so I thought it would be fun to have a little reminder to tell me when I should lower the music or talk less loudly," she says. They gave it a go, but were not able to make their dreamed-of noise-detector function properly. Arduino may be designed for novices, but programming it still requires wading through lines of code, which is a barrier too high for most people.
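    For what it's worth, the noise detector they wanted boils down to a moving average of microphone readings compared against a threshold. A sketch of that logic in Python (the function name, window size and threshold are all invented for illustration; on an actual Arduino this would be a short C loop over analogRead):

```python
from collections import deque

def noise_monitor(samples, window=5, threshold=0.6):
    """Yield True whenever the moving average of mic levels (0..1)
    exceeds the threshold, i.e. time to turn the music down."""
    recent = deque(maxlen=window)  # only the last `window` readings count
    for level in samples:
        recent.append(level)
        yield sum(recent) / len(recent) > threshold

# A quiet evening with one loud stretch in the middle:
levels = [0.2, 0.3, 0.2, 0.9, 0.9, 0.9, 0.9, 0.9, 0.3, 0.2]
alerts = list(noise_monitor(levels))
```

The averaging window is the point: it stops a single clinked glass from triggering the reminder, which is exactly the kind of tuning that apparently defeated the Arduino version.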
    A new crop of devices, however, is removing the need for electronics know-how and programming. One example is called Twine, from a design firm called Supermechanical, based in Cambridge, Massachusetts. Twine is a block of rubber containing a micro-controller, a Wi-Fi radio and internal sensors for acceleration and temperature. It also contains a port by which external sensors for things like moisture can be attached. The whole thing fits easily in the palm of your hand and, crucially, you don't have to know how it works inside or the code required to program it. Twine sells itself with the slogan: "Connect your things to the internet, without a nerd degree". For example, if you wanted to receive a message when your laundry is finished, you might place a Twine on top of the washing machine. Then use a simple app to instruct the Twine to send a text message when it senses that the machine has stopped shaking.
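    The washing-machine example comes down to noticing when vibration stays low for long enough. A hypothetical sketch of that end-of-cycle logic in Python (Twine's real behaviour is configured through its app rather than coded, and all names and thresholds here are made up):

```python
def cycle_finished(readings, quiet_threshold=0.05, quiet_samples=3):
    """Return the index at which the machine has been still for
    `quiet_samples` consecutive readings (each reading is the
    accelerometer's deviation from rest), or None if it never settles."""
    quiet_run = 0
    for i, magnitude in enumerate(readings):
        quiet_run = quiet_run + 1 if magnitude < quiet_threshold else 0
        if quiet_run >= quiet_samples:
            return i  # time to send the "laundry's done" text
    return None

# A noisy spin cycle followed by stillness:
spin_cycle = [0.8, 0.9, 0.7, 0.9, 0.02, 0.01, 0.03, 0.01]
done_at = cycle_finished(spin_cycle)
```

Requiring several consecutive quiet readings avoids false alarms during the brief pauses a washing machine makes between cycle phases.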
    To gauge interest in the idea, last December Supermechanical posted Twine on the crowdfunding site Kickstarter, asking for $35,000 in seed money. The response was enormous - the two-person company raised more than $500,000. Supermechanical designer John Kestner was surprised, especially because most of the people who donated and pre-ordered the device are not your typical geeks.
    When Twine is officially launched in June, it will surely find many more creative uses than monitoring a washing machine. "I'd be lying if I said we had a great idea for exactly what the killer application is for the Internet of Things," he says, "but give people the tools to do it themselves and they'll figure out what makes sense for them."
    Another new project removes the need for a traditional computer altogether: LittleBits, based in New York City, has created a set of sensors and widgets that connect to each other via magnetic links. They require zero programming. To make a device you simply connect, say, a pressure sensor module to an LED light module: press the sensor and the light comes on. The concept is deceptively simple, even toy-like, but founder Ayah Bdeir says the modules can serve serious purposes. She says the company is planning to release new modules with more complex behaviours such as remote control and wireless communication. "There's a limit to what we can do on screens," she says. "LittleBits is trying to add a layer of computation and digital control over the physical objects around us."
    So, given that the enabling hardware is getting simpler every year, it is quite possible that many more people could soon be adding interactivity and connectivity to just about any physical object they own. If that sounds an unlikely prospect, consider that the history of information technology reinforces this idea, and provides lessons for how it will happen.
    Today, people routinely create their own websites, Facebook profiles or blogs - usually by assembling pre-existing chunks and without needing to know the underlying code. Last year, London-based artist and interaction designer Andy Huntington made a lot of people in the technology industry sit up by arguing that there are parallels between the current wave of activity among those making their own smart objects and what happened with the digital internet during the 1990s. Before then, our interactions with information and the media were largely one-way. We watched TV, read newspapers, but could seldom join the conversation.
    That began to change when the tools to shape the internet became widely available, according to Huntington. One particularly influential site at the time was called Geocities. Without much expertise, the users of Geocities could sign up for a page in one of a number of content-themed "neighbourhoods", and were then free to build a web page made up of whatever content they liked. Geocities is now credited with helping to democratise the internet and paving the way for the user-generated, social web we know today. It showed the world that many people wanted to build part of the internet themselves, which was not obvious at the time.
    That is why Huntington calls the smart, networked objects that the makers are creating the "Geocities of Things". Indeed, many parallels are there: Geocities pages were amateurish, splashed with gaudy animations. Maker projects are amateurish, built with cardboard. Geocities allowed anybody to create a webpage. Arduino, Twine and the like allow (almost) anybody to create a smart object.
    So if makers will indeed shape the future Internet of Things, does this recent history provide any clues as to what will happen next? Perhaps. Once Geocities had paved the way for the user-generated internet and Web 2.0, the arrival of MySpace and similar websites then allowed people to join up the web pages they were creating inside social networks. So it is entirely possible that social connections will play a part in driving the growth of the Internet of Things, and perhaps more than many companies and governments currently realise.
    Look at the activity among amateur makers today, and you will find plenty of hints that this could be true. Tom Igoe, a design professor at New York University and Arduino co-founder, notes that once people get the hang of using platforms like Arduino, many tend to gravitate towards building devices that link them to other people. He calls such interactions "remote hugs" - a way of using smart objects to communicate a simple message, such as "I'm thinking of you".
    It is not surprising that people would do this when you consider that many of our possessions represent social connections already. Social scientist Sherry Turkle at the Massachusetts Institute of Technology argues that the objects we own are "evocative" - they anchor memories and sustain relationships. So, a lamp given to you by a loved one, for example, doesn't just give you light: it serves as a reminder of your relationship with them. This social link could be strengthened if the lamp was connected to the internet and periodically communicated a non-verbal message about your loved one, says Matt Ratto, director of the Critical Making Lab at the University of Toronto, Canada. For example, if you so desired, you could program it to dim briefly when they switch off one of the lamps in their home. Admittedly, that particular idea wouldn't be for everybody, but the point is that people can now decide for themselves how to link up their possessions and homes.
    Gartner and her boyfriend are currently wiring up their separate apartments to relay their respective activities to each other, so they'll be alerted when the other gets home, or goes to bed. "We were thinking about how to communicate to each other in more subtle ways than sending texts," she says. Hayes's email soap-bubble gun is arguably another example of such communication, giving physical form to a piece of digital social interaction. "To me, what's more important than the things or what they do is how our relationships are changed through them," says Igoe.
    Some people have already begun to commercialise ideas along these lines. For example, Antony Evans, a strategy consultant turned hardware-hacking entrepreneur based in San Francisco, taught himself to use Arduino and created a shirt that can send a text message if its accelerometer registers that the wearer - such as a grandparent - has fallen down. He rounded up the support of Silicon Valley investors and is currently working to bring the product to market. "Transforming a pile of parts into a blinking and working piece of hardware is a magical experience," he says.
    Indeed, it is almost certain that the current generation of makers will produce ideas with wider uses than just inside their own houses. After all, there are now thousands of people in thousands of homes all over the world tweaking and hacking their possessions. With so many people innovating, some of the smart objects they build will surely go on to have a wider societal impact.
    So whereas the Internet of Things as originally envisioned felt polished and perhaps a little sterile, the reality is likely to look quite different. It may look amateurish at times - it may even look like a plastic toy gun that blows soap bubbles - but this world of connected objects will be totally human.
    MacGregor Campbell is a consultant for New Scientist based in Portland, Oregon
    Live= sgt pantyfire    PSN= pantyfire
  • Wearable muscle suit makes heavy lifting a cinch (23 April 2012, by Rob Gilhooly, Tokyo; magazine issue 2861)
    A lightweight exoskeleton will allow the elderly to move around more easily. New Scientist heads to a Japanese laboratory to try it on for size
    I'M IN a lab in downtown Tokyo full of grinning engineering students, who are peering past PC monitors and half-completed gadgets to watch me try to lift 40 kilograms of rice. No mean feat, but luckily I am about to be given a power boost.
    I shuffle between some boxes and squat down as instructed by research student Hideyuki Umehara, aware of the clutter around me as I fight for floor space with the lower half of a mannequin, an electric wheelchair and an eerily realistic robotic head. Umehara places the bag of rice onto my outstretched arms. Then he presses a switch on the rucksack-like jacket I'm wearing, my hips are propelled forward and gradually my legs straighten until I'm completely upright.
    It takes a second to register, but the 40 kg of rice I just picked up like a human forklift truck suddenly seem as light as a feather. Thanks to the "muscle suit" Umehara slipped onto my back prior to the exercise, I feel completely empowered. Fixed at the hips and shoulders by a padded waistband and straps, and extending part-way down the side of my legs, the exoskeleton has an A-shaped aluminium frame and sleeves that rotate freely at elbow and shoulder joints.
    It weighs 9.2 kg, but the burst of air that Umehara injected into four artificial muscles attached to the back of the frame makes both jacket and rice feel virtually weightless.
    The muscle suit is one of a series of cybernetic exoskeletons developed by Hiroshi Kobayashi's team at the Tokyo University of Science in Japan. Scheduled for commercial release early next year, the wearable robot takes two forms: one augments the arms and back, and is aimed at industries where heavy lifting is required; the other, a lighter 5 kg version, will target the nursing industry, to assist in lifting people in and out of bed, for example.
    Kobayashi's muscle suit is the latest in a long line of exoskeletons dating back to General Electric's 1965 "man amplifier", the Hardiman. In the intervening years there have been a number of attempts to build devices that augment performance for soldiers, or to help disabled people. Some successful creations, such as the HULC by Ekso Bionics and Raytheon's XOS2, are still in development for the military.
    Yet many exoskeleton projects hit problems early on that delayed or prevented commercial release. Most relate to the inability to generate sufficient power to safely drive the multiple motors required to mobilise the often-hefty suits.
    Kobayashi believes his suit will be different. It doesn't have heavy electric actuators and hydraulics, but instead comes with PAMs - pneumatic artificial muscles. These lightweight, mesh-encased rubber bladders are designed to contract when pressurised air is pumped in. The PAMs give 30 kg or more of instant support, depending on how far the weight is from the body. "The power-to-weight ratio is 400 times greater than motor-driven suits," says Kobayashi, who adds that unlike motors, PAMs are unaffected by water and dirt. A regulator controls the compressed air output based on a signal given by a microprocessor, which in turn communicates with an acceleration sensor in the frame that detects and responds to movement.
    As well as its high power-to-weight ratio, the muscle suit's huge advantage, Kobayashi says, is its simple controls, which are largely preprogrammed to mimic natural human movements. Walking or lifting are triggered via the jacket's sensor, which responds to both simple voice commands, such as "start" or "stop", and the body's acceleration. If the wearer is standing upright or moving more slowly than the preset acceleration threshold then the device will not move. A simple dial can control the suit's speed. The exoskeleton will be available to rent from ¥15,000 (£115) per month, although Japan's health insurance will cover 90 per cent of the charge in many cases.
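The gating logic described here can be sketched as a simple decision function. This is an illustrative Python sketch with made-up names and threshold values, not anything from Kobayashi's actual firmware:

```python
def should_assist(acceleration, voice_command=None,
                  standing_upright=False, accel_threshold=0.5):
    """Decide whether to pressurise the artificial muscles.

    Mirrors the gating described for the suit: an explicit "stop"
    always wins, "start" triggers assistance, and otherwise the suit
    stays passive while the wearer is upright or moving more slowly
    than the preset acceleration threshold.
    """
    if voice_command == "stop":
        return False
    if voice_command == "start":
        return True
    if standing_upright or acceleration < accel_threshold:
        return False
    return True

print(should_assist(0.8))                         # True: fast enough movement
print(should_assist(0.2))                         # False: below the threshold
print(should_assist(0.8, standing_upright=True))  # False: upright, no assist
print(should_assist(0.0, voice_command="start"))  # True: voice command wins
```

The appeal of this design is that the wearer never programs anything: the control surface is just a voice command, a speed dial, and the way they move.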
    "Years ago I was attracted by cool-looking robots, but basically they were of little use to society," Kobayashi said from his office, which is decorated with achievement awards and houses the prototype for his best-known creation,Saya the humanoid robot teacher. "I think our muscle suit is the only practically usable tool worldwide."
    In Japan there has been a surge in R&D into exoskeletons, largely because of the country's rapidly ageing population: more than 30 per cent may be over 65 by 2025. In a recent science and technology white paper the government emphasised the need for robotic devices in a society where increasingly "the elderly will be caring for the elderly".
    Later that day, I get the chance to try out the simpler version of the suit, which has no metal sleeves to support the arm. It is noticeably lighter, though the final product, says Umehara, will be lighter still, weighing around 4 kg. "I always thought this was part of fiction," he says, "but now, it's just a step away."
    Let the muscle suit take the strain
    Exoskeletons won't just help you lift heavy stuff: you'll also be able to hold it for longer. Hiroshi Kobayashi's team at the Tokyo University of Science, Japan, is measuring muscle fatigue using near-infrared spectroscopy, to gauge the benefits of their "muscle suit". Results show that continuous muscle use without the exoskeleton produces an increase in oxygenated haemoglobin or "oxy" and a decrease in deoxygenated haemoglobin or "doxy", which indicates muscle fatigue. The difference between oxy and doxy when using the muscle suit was negligible, Kobayashi said. The team expect to present their work at the International Conference on Intelligent Robots and Systems to be held in Portugal, in October.
  • Rich seams of Earth's secret moons (24 April 2012, by Stuart Clark; magazine issue 2861)
    Right now, a mini-moon is probably orbiting our planet. It could reveal the history of the solar system – and make some people very rich
    Update: A new company backed by Google founder Larry Page and film director James Cameron will later today unveil its plans to mine near-Earth asteroids for precious metals, such as platinum. Mining on an asteroid is fraught with difficulty due to the space rock's low gravity and the fact that it is hurtling through space. The job is made much easier if the asteroid is captured by Earth's gravity and becomes a temporary mini-moon. Here we describe how such asteroids have been captured and the potential riches they hold.
    See gallery: "Roughnecks in space: Moon mining in science fiction"
    ALMOST a century ago, something strange split the sky across North America. On 9 February 1913, eyewitnesses reported dozens of burning fireballs cutting a swathe across the night sky. It was a display unlike any other meteor shower. Instead of shooting stars raining down in all directions, a train of bright fireballs moved slowly and deliberately over much of the continent.
    The first sighting was in Saskatchewan, Canada. Burning red-hot from its passage through the atmosphere and trailing streaks of vapour, the meteor train moved south-east, passing just a few kilometres north of New York and then out over the Atlantic Ocean. Final sightings of the spectacle came from Bermuda and a steamer ship near the equator.
    The distance between the first and last observing points was 9200 kilometres. To be seen over such an expanse, the meteors must have been in orbit around our planet. The conclusion was compelling: what people had seen that night was probably the break-up of a small, previously undiscovered moon of Earth.
    We are now realising that the events of 1913 may not be unique. Computer models of asteroid orbits are showing that small space rocks a few metres across can lodge in Earth's gravitational field if they stray too close. Only a tiny fraction of them break up and hit our planet. Most orbit unseen for months or years, somewhere beyond the moon, before slipping safely and silently back into deep space. But while they remain close, they are mini-moons of Earth. Not only are they turning out to be more common than anyone thought, they could play a vital role in unravelling our solar system's secrets.
    It is not unheard of for a planet to capture a small celestial object. Jupiter is a master of the art: it is 320 times more massive than Earth, and also orbits five times farther away from the sun. At that distance, the sun's gravity is much weaker, so Jupiter can wrestle objects away from it and clutch its prey more tightly. Jupiter's most notable recent catch was comet Shoemaker-Levy 9. The giant planet's gravity subsequently pulled the comet to pieces and swallowed it in a series of spectacular explosions in July 1994.
    Thankfully for us, Earth's gravity is much weaker, meaning such violent acts are extremely rare. Most of the objects that do make it to Earth originate in the asteroid belt between the orbits of Mars and Jupiter. However, telescopes designed to identify asteroids that may one day smack into Earth have found growing numbers of objects in orbits across the solar system. Most are small, fragments of once larger objects that have broken up in collisions over the aeons. Any such small body that finds itself passing by on an orbit similar to Earth's is likely to be snagged and yanked onto a course that takes it around our planet instead - if only for a short while.
    Moon trackers
    Mikael Granvik at the University of Helsinki in Finland and his colleagues are among those dedicated to tracking down these celestial fly-by-nights. Calculations by Granvik's group show that mini-moons are likely to be a few metres across and orbit slowly at up to 12 times the distance of the moon. The course they chart around Earth is a delicate one because of perturbations in the gravitational field from the sun and other planets (see "Mini-moon or quasi-moon?"). As a result, Granvik's model predicts that most captured objects drift off again, spending on average just 9 months in orbit. Perhaps the biggest surprise, however, is that such temporary mini-moons are common. "There is probably one up there right now," says Granvik.
    Finding mini-moons is no easy matter because their small size means they reflect little light. Present surveys, such as the one conducted with the Pan-STARRS telescope in Haleakalā, Hawaii, are looking for potentially hazardous near-Earth asteroids. But they are not really powerful enough to search for metre-sized mini-moons.
    What Granvik and others do see often turns out to be space junk masquerading as small asteroids. "Out of six objects we have investigated, five have turned out to be upper stages of rockets," says Paul Chodas at NASA's Jet Propulsion Laboratory in Pasadena, California (see "Space junkie").
    That leaves one. On 14 September 2006, the Catalina Sky Survey detected an object in Earth orbit. Designated 2006 RH120, it was calculated to have been captured by Earth in July of that year. As astronomers watched, it made three leisurely orbits over the following 12 months, one of them bringing it inside the orbit of the moon, and then drifted away again.
    This time everything fitted. To distinguish it from space junk, Chodas analysed the body's orbit. Asteroids move primarily under the pull of gravity because they are dense, whereas space junk tends to be hollow and gets pushed around by the pressure of sunlight as well. The weight of evidence pointed to 2006 RH120 being an asteroid some 5 metres across. It is now moving away around the sun in a similar orbit to Earth. By 2017, it should be on the opposite side of the sun from us. Its return visit is likely to take place around 2028.
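The physics behind that test can be put in rough numbers. The sketch below is a back-of-envelope Python estimate with assumed figures (a rock density of 2000 kg/m³ and a 2-tonne hollow stage are illustrative guesses, not values from Chodas's analysis), comparing the acceleration sunlight imparts to a dense 5-metre rock with that imparted to a spent rocket stage of similar size:

```python
import math

def radiation_acceleration(area_m2, mass_kg, flux=1361.0, c=2.998e8):
    """Acceleration (m/s^2) from solar radiation pressure at 1 AU,
    treating the object as a flat, perfectly absorbing target."""
    return (flux / c) * area_m2 / mass_kg

r = 2.5                        # radius of a ~5-metre object, in metres
area = math.pi * r**2          # sunlit cross-section

# Dense rocky asteroid, assumed density ~2000 kg/m^3
rock_mass = 2000 * (4 / 3) * math.pi * r**3
a_rock = radiation_acceleration(area, rock_mass)

# Hollow spent rocket stage with the same cross-section, assumed ~2 tonnes
a_junk = radiation_acceleration(area, 2000.0)

print(a_junk / a_rock)  # sunlight pushes the hollow stage ~65 times harder
```

Tiny as both accelerations are, over months of tracking the difference adds up to a measurable drift in the orbit, which is what lets observers tell dense rock from hollow hardware.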
    Granvik suspects that astronomers have sighted other bona fide mini-moons but have simply disregarded them. "When observers see an object in Earth orbit," he says, "they tend to think it is just space junk and so throw the data away." He is hopeful that new surveys will lead to the discovery of many more mini-moons. The Large Synoptic Survey Telescope, an 8.4-metre instrument planned for completion on the slopes of Cerro Pachón in Chile in 2019, will survey the entire sky once a week looking for asteroids. It should be able to spot mini-moons easily and quickly. "If we do start to routinely find mini-moons in the future, we will have the opportunity to study a population of small asteroids that we have not seen before," says Granvik. "We could easily send spacecraft to them."
    The scientific pay-off of such missions would be large indeed. Asteroids are the leftovers of planet formation. They are the fragments of rock and metal that never managed to coalesce into larger worlds. As such, they hold clues about the way the planets formed, such as the raw ingredients that went into those worlds - including the organic components that Earth managed to cook up into life.
    Fly me to the moon
    Considering the great pains that planetary scientists have gone to in recent years to bring back a few specks of dust from an asteroid, a mission to a mini-moon would offer convenience and bounty beyond their wildest dreams. The Japanese space agency's Hayabusa probe had already suffered its fair share of setbacks and delays when it finally blasted off towards asteroid Itokawa in 2003. It was hit by a violent solar storm that damaged its solar panels, reducing its on-board power and slowing the spacecraft down. On arrival at Itokawa in 2005, moves to stabilise the spacecraft failed and communications were lost during the attempted landing. On top of that, the sampling device did not work correctly. Nevertheless, the spacecraft limped home and delivered its precious cargo of asteroid dust in June 2010.
    Instead of a spacecraft taking months or years to journey into space to rendezvous with asteroids, it could be at a mini-moon after only a few weeks, or even days, of travelling time. "At just a few metres across, they are small enough that we could even bring a complete one back to Earth for analysis," says Granvik.
    Beyond the value of the science, there could be other - more lucrative - rewards for bringing a mini-moon down to Earth: precious metals.
    Asteroids come in three basic types. M-types are largely metal and were once at the hearts of now-shattered protoplanets. S-types are stony asteroids but are noticeably rich in metals such as iron, nickel and magnesium. C-types are the most common and are composed of elements in their average cosmic abundances but without the hydrogen and helium gases. Even though C-types are not notably enriched, they still contain enough precious metals to make them extremely valuable if they were brought to Earth.
    The last time people were talking about mining asteroids for mineral resources, the chances are they were wearing a tank top and corduroy flares. It was all part of the Apollo-era optimism about living and working in space - and it collapsed along with NASA's budget sometime in the 1970s. Now the idea, unlike the tank top, is back in fashion.
    The reason for the renewed interest is the steady rise in the price of gold and other metals during the past decade. Back in 1994, William Hartmann at the Planetary Science Institute in Tucson, Arizona, estimated that a 2-kilometre-wide asteroid would be worth $25 trillion in metal and mineral resources (see diagram). That's enough to pay off the US's $15 trillion national debt, use the loose change to settle up for Greece and still make the investors very rich indeed. "I don't see how you can look at any economic study of Earth and not think about the potential resources of the inner solar system," says Hartmann.
    The trouble is, of course, that such resources are not exactly easy to reach. Mini-moons could change that. Although they are just a thousandth of the diameter of the asteroid that Hartmann used in his 1994 example, they are far easier to reach. "Once Earth has captured these asteroids, they become accessible to us," says Chodas, although he remains unsure about the practicality of mining in space.
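A quick back-of-envelope calculation shows what that factor of a thousand in diameter means for value, assuming (loosely) that an asteroid's worth scales with its volume and that the composition is comparable:

```python
hartmann_value = 25_000_000_000_000  # $25 trillion for a 2-km asteroid (from the text)
diameter_ratio = 1000                # mini-moons are ~a thousandth of that diameter
mini_moon_value = hartmann_value / diameter_ratio**3
print(f"${mini_moon_value:,.0f}")    # $25,000: value falls with the cube of diameter
```

Which underlines the point: the draw of a mini-moon is its accessibility and scientific value, not any prospect of paying off a national debt.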
    Aerospace engineer Hexi Baoyin and his colleagues at Tsinghua University in Beijing, China, are taking the idea further. Having independently identified how small asteroids can be naturally captured by Earth, they have suggested that some closely approaching asteroids could be nudged into Earth orbit, either by slamming projectiles into them or using more subtle methods like erecting solar sails on them. This, they propose, could be one of the best ways to make mining near-Earth asteroids possible.
    As the world's human population increases, so the demand for resources will grow. Some of Earth's resources are already expensive to mine and this may make asteroids increasingly tempting targets - especially if the cost of space missions drops dramatically thanks to the efforts of private space companies such as SpaceX in Hawthorne, California. It is one of several companies in which the US government is investing to try to bring the cost of launching each pound (0.45 kilograms) down to below $1000.
    Even so, Chodas is sceptical. "Getting the equipment to these asteroids is expensive, getting the minerals back is even more so," he says. "I'm not sure space travel will ever be cheap enough to make mining asteroids viable."
    Even if asteroid mining never takes off, mini-moons are still a source of wonder. Why spend billions travelling to an asteroid when they are gently knocking on our front door? Unseen they may be, yet as we gaze out into the night sky, there is every reason to think that they are up there, taking turns to orbit our planet, like celestial fruit just waiting to be picked.
    Mini-moon or quasi-moon?
    Discovered in 1986, asteroid Cruithne shot to fame when astronomers spotted that it takes a year to orbit the sun and never strays too far from Earth. This led to it being dubbed Earth's second moon. But according to Mikael Granvik at the University of Helsinki in Finland, that is stretching things a little too far. Cruithne is an example of a quasi-moon - its path is dictated by the sun's gravity rather than our planet's. "If you took away the Earth, a quasi-moon's orbit would not be affected," says Granvik.
    In contrast, mini-moons are actually in Earth's gravitational clutches - at least for a while. A mini-moon's orbit is determined by a delicate balance between the gravitational fields of Earth, the sun and other celestial bodies. This is what makes them susceptible to falling into orbit around Earth in the first place, and then prone to drift off again.

    Space junkie
    Paul Chodas studies near-Earth objects, and part of his job is to investigate the reports of possible mini-moons.
    Based at NASA's Jet Propulsion Laboratory in Pasadena, California, Chodas's latest case is asteroid 2010 KQ, discovered in May 2010 by the Catalina Sky Survey in Tucson, Arizona. Just a few metres across, it seemed a dead ringer for a mini-moon. Then infrared observations showed that its composition resembled no known asteroid type, but instead was reminiscent of metal rocket parts. Chodas provided the clinching evidence by showing that 2010 KQ had been very close to Earth in 1975, although its orbit is not known well enough to associate it with a specific rocket launch.
    He was also responsible for the calculations showing that another mini-moon, discovered in September 2002, was probably Apollo 12's upper stage returning to Earth for a lap of honour. It had been left in a loose Earth orbit back in November 1969, had slipped into orbit around the sun, and been temporarily recaptured by Earth during 2002 and 2003.

    Stuart Clark is a consultant to New Scientist. His new novel is The Sensorium of God (Polygon)
  • Deep future: Where will we explore? (6 March 2012, by Anne-Marie Corley; magazine issue 2854)
    As we spread into the cosmos, the route we take will be shaped by old and familiar human desires – and perhaps even a dash of religious fervour
    Read more: "100,000 AD: Living in the deep future"
    IT IS an inescapable fact. The destinations we can visit in outer space will always be limited by the technical challenges of travelling the unimaginable distances involved, especially within a human lifespan. Still, that will not be the only factor shaping where our descendants go. The route that they take into the cosmos will be equally driven by age-old human motivations - and perhaps even a dash of religious fervour.
    First, the bad news. Last year, a group of scientists, engineers and futurists assembled in Orlando, Florida, to plot humanity's next era of exploration. The name of the plan was the 100 Year Starship Study. The idea was to begin to work out, over the next century, how to get humans to the nearest stars. You can't fault the idea for ambition, but many of them soon realised that developing the necessary technology was daunting, if not fanciful.
    Neal Pellis of the Universities Space Research Association based in Columbia, Maryland, summed up just how far our fastest spacecraft are from achieving interstellar travel. "The nearest star is Alpha Centauri," he told the 100 Year Starship meeting's participants. "At 25,000 miles per hour, it would take 115,000 years to get there. So this is not a plan."
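Pellis's figure is easy to verify with a rough calculation, taking Alpha Centauri at about 4.37 light years:

```python
LIGHT_YEAR_KM = 9.461e12                 # kilometres in one light year
distance_km = 4.37 * LIGHT_YEAR_KM       # distance to Alpha Centauri
speed_km_per_h = 25_000 * 1.609344       # 25,000 mph converted to km/h

hours = distance_km / speed_km_per_h
years = hours / (24 * 365.25)
print(f"{years:,.0f} years")             # about 117,000 years at that speed
```

The order of magnitude, not the exact figure, is the point: chemical-rocket speeds are about four orders of magnitude too slow for a crewed interstellar trip.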
    Even if we figure out how to travel at the speeds required to arrive at a star in a human lifetime, the energy required to get there is far beyond our means for the foreseeable future. Mark Millis of the Tau Zero Foundation, a space-travel think tank based in Fairview Park, Ohio, says only a tiny proportion of today's global energy output goes towards space flight. If this state of affairs persists while energy production continues to grow at the rate of recent decades, then interstellar missions are at least two to five centuries away, he calculates.
    For the next few centuries, then, if not thousands of years hence, humanity will be largely confined to the solar system. Even reaching destinations closer to home will remain slow going until we find better propulsion systems than chemical rockets, which are like Columbus's ships in terms of speed and technology, says NASA planetary scientist Chris McKay.
    Assuming we achieve the speed boost we need, what routes might we take further into space, and what will drive exploration? Scientists will no doubt continue to send uncrewed probes all over the solar system, but if history is any guide, human exploration and settlement of space will not be driven by scientific curiosity alone.
    Roger Launius, NASA's former chief historian, now senior curator at the Smithsonian National Air and Space Museum in Washington DC, says that whenever people have ventured into unexplored corners of Earth, their motivation has tended to be "God, gold or glory" - in other words, a drive to convert indigenous peoples or escape religious persecution, or to extract wealth or earn fame.
    In search of glory
    Much of human space exploration to date has arguably been motivated by glory. National pride was behind the first crewed space missions and fuelled the colossal investment required to put people on the moon. Political will of the same order will be needed to realise the first Mars walk or human visit to an asteroid.
    Further down the track, nations or companies may want to be the first to send astronauts to rocky worlds like Saturn's moon Titan, which sports polar lakes of liquid methane. Another tempting expedition would be to Jupiter's moon Europa - especially if the liquid ocean under its surface ice turns out to be home to extreme life forms.
    What about God? Could religious motivation play a role in space travel? Future solar-system explorers will have no local aliens to convert, but religion could conceivably be a reason to flee Earth. In the 17th century, for example, English Puritans risked their lives to settle in America for the sake of practising their beliefs. If the private spaceflight industry provides the means, it's not impossible that a religious group might be among the first to populate the moon or a Mars base.
    Nevertheless, the dominant drivers of exploration in our history have been economic ones, Launius says. For a space economy, mining asteroids has been proposed, as has space tourism, but neither's time has come yet. "We have yet to find an economic motive to undertake space activities that would involve humans," Launius says. For example, it's impossible to predict what mineral resources will be important to us mere decades from now (see "Will we run out of resources?"). By the time it becomes viable to mine, say, platinum from asteroids, humanity's demand for that metal may have faded.
    Another lesson of history is that exploration has not always been sustained. Instead, it often happens in fits and starts. Consider how the Vikings ventured into North America a thousand years ago, yet permanent European settlement did not follow for another four centuries. Chinese exploration also went on for centuries but ceased by 1500 or thereabouts.
    "There's nothing inevitable about space travel," says John Logsdon, a space-policy researcher at George Washington University in Washington DC. He suggests that subsequent generations may take a break from exploring deep space or even venturing beyond Earth.
    Indeed, our descendants may well have to come to terms with never having the means or lifespan to reach other stars. For them, the stars will remain tantalising twinkles of light, forever beyond reach.
    Then again, there will always be people, like the delegates to the 100 Year Starship meeting, who will work to keep the dream alive.
    Live= sgt pantyfire    PSN= pantyfire
  • No-waste circular economy is good business – ask China (18:16 29 February 2012, by Michael Marshall)
    Don't throw out that broken toaster: it's key to our prosperity. Redesigning the economy so that all waste is reused or recycled would be good for business, according to two new reports.
    For centuries the global economy has been linear. Companies extract resources from the environment, turn them into products and sell them to consumers – who eventually throw them out. As a result we are burning through Earth's natural resources and wasting useful materials.
    But it doesn't have to be that way, says Felix Preston of think tank Chatham House in London. Instead, we could have a circular economy in which waste from one product is used in another.
    In "A Global Redesign: Shaping the circular economy", Preston argues that reusing resources makes good business sense now that resource prices are high and volatile. He cites a January report by consultants McKinsey & Company which tries to put a value on the circular economy.
    "Towards the Circular Economy: Economic and business rationale for an accelerated transition" estimates the circular economy could save the European Union $340 to $630 billion per year in materials costs, about 3 per cent of the EU's GDP.
    "The opportunity is enormous," Preston says. "The challenge is how to unlock it."
    However, a company wishing to go circular will face considerable upfront costs, and companies that have invested heavily in the existing system will be reluctant to change. Nevertheless some are pushing forward: for instance Renault's Eco2 cars are designed so that 95 per cent of their mass can be recovered and reused.
    China is already pushing the circular economy. According to its 12th five-year plan – covering 2011-15 – China will "plan, construct and renovate various kinds of industrial parks according to the requirements of the circular economy".
  • Deep future: Why we'll still be here (07 March 2012, by Michael Brooks; Magazine issue 2854)

    Video: How to survive the next 100,000 years

    Read more: "100,000 AD: Living in the deep future"
    WHAT are the odds we will avoid extinction? In 2008, researchers attending the Global Catastrophic Risk Conference in Oxford, UK, took part in an informal survey of what they thought were the risks to humanity. They gave humans only a 19 per cent chance of surviving until 2100. Yet when you look more closely, such extreme pessimism is unfounded. Not only will we survive to 2100, it's overwhelmingly likely that we'll survive for at least the next 100,000 years.
    Take calculations by J. Richard Gott, an astrophysicist at Princeton University. Based on 200,000 years of human existence, he estimates we will likely last anywhere from another 5100 to 7.8 million years (New Scientist, 5 September 2007, p 51).
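    Gott's figures follow from a simple "Copernican" argument: assuming we observe ourselves at a random moment in humanity's total lifespan, there is a 95 per cent chance we are somewhere in the middle 95 per cent of it. A minimal sketch of that arithmetic (the derivation is standard; the function name is mine):

    ```python
    # Gott's "delta t" argument: if the present is a random point in
    # humanity's lifespan, then with confidence c the elapsed fraction
    # lies in [(1-c)/2, (1+c)/2], which bounds the remaining lifetime
    # between t_past * (1-c)/(1+c) and t_past * (1+c)/(1-c).

    def gott_interval(t_past, confidence=0.95):
        """Return (low, high) bounds on the remaining lifetime, in the
        same units as t_past, at the given confidence level."""
        f = (1 - confidence) / (1 + confidence)  # = 1/39 for 95%
        return t_past * f, t_past / f

    low, high = gott_interval(200_000)  # ~200,000 years of human existence
    print(f"{low:,.0f} to {high:,.0f} more years")
    ```

    With 200,000 years behind us this gives roughly 5,100 to 7.8 million years ahead, matching the quoted range.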
    Fossil evidence is similarly reassuring. Records in the rocks suggest that the average species survival time for mammals is about a million years, though some species survive 10 times as long. It seems there is plenty of time left on our clock. Plus, if you'll excuse the blowing of our own trumpet, we are the cleverest of the mammals.
    Mind you, this could be seen as a problem. Probably the greatest threat to an advanced civilisation is technology that runs out of control; nuclear weapons, bioengineering and nanotechnology have all been cited as bogeymen. But disaster expert Jared Diamond, a geographer at the University of California, Los Angeles, points out that we no longer live in isolated civilisations. Humanity is now a global network of civilisations, with unprecedented access to a diverse, hard-won pool of knowledge already being harnessed for everyone's protection.
    We are also unlikely to be extinguished by a killer virus pandemic. The worst pandemics occur when a new strain of flu virus spreads across the globe. In this scenario people have no immunity, leaving large populations exposed. Four such events have occurred in the last 100 years - the worst, the 1918 flu pandemic, killed less than 6 per cent of the world's population. More will come, but disease-led extinctions of an entire species only occur when the population is confined to a small area, such as an island. A severe outbreak will kill many millions but there is no compelling reason to think any future virus mutations will trigger our total demise.
    More scary is the prospect of a supervolcano eruption. Every 50,000 years or so, a supervolcano somewhere erupts and ejects more than 1000 cubic kilometres of ash. Such events have been linked with crashes in human population. Around 74,000 years ago, Toba erupted in Sumatra.
    Anthropologists have suggested that the event may have reduced the human population of Earth to just a few thousand (New Scientist, 17 April 2010, p 28). But as Bill McGuire, director of the Benfield Hazard Research Centre at University College London, points out, there were many fewer humans then and they were largely confined to the tropics, a geographical concentration that made the eruption's impact much more severe than would be the case with today's widely distributed population. "Wiping out 7 billion people today would be far more difficult," he says.
    Judging by their historical frequency, it is estimated that the chance of a super-eruption in the next 100,000 years is between 10 and 20 per cent. With colossal clouds of ash plunging the surface of Earth into darkness for five or six years, global harvests would be badly hit for long enough to cause loss of life on an unprecedented scale. "The likely death toll would be in the billions," McGuire says. But it would have to happen twice in that timescale for a realistic chance of human extinction. That's not impossible, just statistically extremely unlikely.
    Deep impact
    The biggest extinction threats of all come from space. Solar flares, asteroid strikes and bursts of gamma rays from supernova explosions or collapsing stars are what we really need to get through. "Every 300 million years we would expect a gamma-ray burst or a severe supernova explosion that wipes out most of the ozone layer," says Brian Thomas, an expert on intergalactic hazards based at Washburn University in Topeka, Kansas. The result would be a massive increase in harmful radiation at the Earth's surface and an increased incidence of life-threatening cancers during the decades it would take for the ozone layer to recover. It's impossible to know when such an event might occur.
    Yet these things are so rare that the chance of an extinction event in the next 100,000 years is effectively zero. The same can be said for the threat of a solar flare so powerful that it knocks out all critical infrastructure, because it would take flares 1000 times more powerful than the biggest ever seen. "Can our sun, in its present state, produce such a flare very occasionally? We don't know," says Mike Hapgood, a solar physicist based at the Rutherford Appleton Laboratory in Oxford, UK, and project manager for the European Space Agency's Space Weather Programme. But it remains an unlikely disaster scenario. Which leaves the poster child of disaster movies: the asteroid strike.
    This one will take some luck to avoid. Space is full of rocky debris that acts as an occasional threat to Earth. It is widely believed that the impact of a 15-kilometre-wide asteroid wiped out the dinosaurs 65 million years ago. In any 100,000 year period we can reasonably expect an impact from a 400-metre asteroid that will cause damage equivalent to 10,000 megatonnes of TNT. "Not enough to do in the whole civilisation, but certainly destroy an entire small country like France," says former astronaut Thomas Jones, who co-chairs NASA's Task Force on Planetary Defence.
    Some might argue that without France there is little hope for civilisation anyway, but in reality there is only a 1-in-5 chance of total wipeout. "Global effects come from an impact roughly every 500,000 years, so the odds are about 20 per cent for a catastrophic, civilisation-threatening impact within 100,000 years," Jones says.
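    Jones's 20 per cent figure simply divides the 100,000-year window by the 500,000-year recurrence interval. Treating impacts as a Poisson process instead (my assumption, not stated in the article) gives almost the same answer at rates this low:

    ```python
    import math

    # Model global impacts as a Poisson process with a mean rate of one
    # civilisation-threatening impact per 500,000 years (an assumption;
    # the article just divides the two timescales).
    rate = 1 / 500_000            # impacts per year
    window = 100_000              # years
    expected = rate * window      # 0.2 expected impacts in the window

    p_at_least_one = 1 - math.exp(-expected)
    print(f"{p_at_least_one:.1%}")  # ~18%, close to the quoted 20%
    ```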
    We should probably work on some anti-asteroid measures, but really humans concerned about the longevity of our species can relax: the view from here is fine.
  • Deep future: What will we be like? (29 February 2012, by Graham Lawton; Magazine issue 2854)
    We're not so different to humans who roamed Earth 30,000 years ago. Will genetic engineering transform us in the long run?
    THERE'S a famous thought experiment about kidnapping a Cro-Magnon man, bathing and shaving him, dressing him in a suit and putting him on the New York subway. Would anybody bat an eyelid?
    Probably not. Though Cro-Magnons lived about 30,000 years ago, they were to all intents and purposes modern humans. Physically they were perhaps a little more robust, but behaviourally they were indistinguishable from us, give or take the effects of thousands of years of technological progress on our lives.
    We have undoubtedly come a long way since then. A Cro-Magnon in 21st-century New York would recognise almost nothing except for other human beings. But his modern human brain would eventually adjust to the startling new surroundings, much as the Tierra del Fuego native who became known as "Jemmy Button" took to Victorian London after he was brought there in 1830 by Robert FitzRoy, captain of the Beagle.
    Now turn that thought experiment on its head and project it into the deep future. What if somebody alive today could be transported to the equivalent of New York 30,000 years - or even 100,000 years - from now? Even if suitably attired, would they fit in?
    Impossible to say, of course. Just because we've had more than 1000 generations of biological stasis does not mean we can expect thousands more. If you believe some futurists, we will eventually become cyborgs with prostheses in our brains and nanobots racing around our bloodstream.
    Extreme as these technological enhancements may sound, they won't produce changes to our bodies and minds that will be heritable and so alter our fundamental biology. Each generation will have to choose whether or not to become cyborgs, just as people can opt for laser eye surgery today. For our descendants to be radically different from us, we would have to engineer our own genome or wait for an event that has happened only rarely in our evolutionary line.
    One hypothesis to explain the sudden rise in behaviourally modern humans 30,000 to 40,000 years ago is the random appearance of a beneficial genetic mutation, perhaps involved in language. So beneficial in fact, that the mutation swept through the population. Humans without it would have been unable to compete with their more fortunate fellows, and their less fit genomes would have been consigned to the scrap heap of evolution (Evolutionary Anthropology, vol 17, p 267).
    The "great leap forward" mutation, if it ever existed, will probably never be identified as it has completely replaced the version of the gene that preceded it. But we can see signs of similar sweeps that are not yet complete. For example, a mutation in a gene called microcephalin arose around 14,000 years ago and is now carried by 70 per cent of people. It appears to be involved in brain development, though it is not clear what trait it is being selected for since there is no discernible difference between people who carry it and those who don't.
    So it is possible that our descendants could evolve into something similar to Homo sapiens today. But radical change seems a long shot.
    Of course, we could eventually decide to take evolution into our own hands. In principle, we could engineer ourselves into obsolescence by creating a new breed of human that would outcompete ourselves. The most plausible technology for starting down this road is to genetically engineer sperm or eggs, or early embryos, in order to install changes in their genomes that will be passed down the generations. This is just about possible with today's technology, and has been put forward as a way of stamping out genetic diseases such as cystic fibrosis.
    Would we go so far as to put desirable traits in rather than just take bad ones out? Even if it were technically possible to do this, it is doubtful that we would collectively agree such changes on a scale that would alter the course of our evolution - unless, of course, engineered humans were so superior that they obliterated the competition.
    These possibilities cannot be ruled out. Surely the most likely option is that our time traveller will find himself among friends, a species of human fundamentally the same as us but with cooler technology. Deep down, they will still be human.
  • Deep future: Where will we live? (29 February 2012, by Michael Le Page and Jeff Hecht; Magazine issue 2854)
    Plate tectonics, volcanoes and rising seas will reshape our world. What will it look like and where will we live?
    FISHING boats in the North Sea bring up some strange things in their nets, from the bones of mammoths to ancient stone tools and weapons. Here and in many other places around the world, we are discovering the remains of human settlements on what is now the seabed. As the world changed after the last ice age, many of our ancestors were forced to abandon their homes. And over the next 1000 years, let alone 100,000, the world is going to change dramatically again, forcing billions of people to find a new place to live.
    Some places would battle to survive even if sea level remained constant. The ancient Egyptian city of Herakleion disappeared beneath the Mediterranean Sea 2000 years ago as the soft sands of the delta it was built on subsided, and the same is happening to modern cities such as New Orleans and Shanghai. In Miami and elsewhere, seas and rivers are eroding the land that cities are built on.
    With a stable climate, it might be possible to save cities like these. But as the world continues to warm, rising sea levels are going to drown many of our coastal cities, along with much farmland. The changing climate will also affect people living well above sea level, making some areas uninhabitable but creating new opportunities elsewhere.
    We don't know exactly how much hotter the world will become. But let's suppose events follow the Intergovernmental Panel on Climate Change's "business as usual" scenario, with greenhouse emissions continuing to grow until 2100 and then declining rapidly. Suppose, too, that we do not attempt any kind of geoengineering.
    The most likely result is that the average global temperature will rise nearly 4 °C above the pre-industrial level around the year 2100, peaking at 5 °C sometime in the 23rd century (though it might well get a lot hotter than this). It will stay hot, too, as it will take 3000 years or so for the planet to cool just 1 °C.
    That might mean that the Greenland ice sheet will be almost gone in 1000 years, with the West Antarctic ice sheet following it into the sea, raising its level by well over 10 metres. That's bad news given that coastal regions are home to much of the world's population, including many rapidly growing megacities. As the sea level rises, billions of people will be displaced.
    At least this will likely be a gradual process, though there may be occasional catastrophes when storm surges overcome flood defences. Large areas of Florida, the East and Gulf coasts of the US, the Netherlands and the UK will eventually be inundated. Some island nations will simply cease to exist and many of the world's greatest cities, including London, New York and Tokyo, will be partly or entirely lost beneath the waves.
    And as the great ice sheet of East Antarctica slowly melts, the sea will rise even higher. For each 1 °C increase in temperature, sea level could eventually rise by 5 to 20 metres. So in 5000 years' time, the sea could be well over 40 metres higher than today.
    Even those living well above sea level may be forced to move. Some regions, including parts of the southern US, may become too dry to support farming or large cities. In other areas, flooding may drive people out.
    Any further warming will cause catastrophic problems. A 7 °C global rise will make some tropical regions so hot and humid that humans will not be able to survive without air conditioning. If the world warms by 11 °C, much of the eastern US, China, Australia and South America, and the entire Indian subcontinent, will become uninhabitable (see map).
    Yet the future will open up alternative places to live. In the far north, what is now barren tundra and taiga could become fertile farmland. New land will also appear as the ice sheets melt.
    A rush to exploit the resources in newly exposed bedrock in Antarctica, for instance, could encourage settlement in its coastal regions (see map). If it stays hot enough for long enough, Antarctica will once again be a lush green continent covered in forests. Elsewhere, pockets of fresh land will rise out of the ocean in the space of hundreds of thousands of years, perhaps ripe for human settlement (see "Land ahoy!").
    At some point our descendants could take control of the global climate. But it will take thousands of years to restore the ice sheets and get sea levels back down. By the time we are in a position to do so, some people may like life just as it is. The proud citizens of the Republic of Antarctica will fight any measure that would lead to their farms and cities being crushed by ice.
    Land ahoy!
    New lands will rise from the sea. It's time to start composing their national anthems.
    Throughout history, explorers have planted their flags on virgin lands. Today, there's almost nowhere left on Earth where we haven't set foot - but that won't always be the case.
    Plate tectonics and volcanism are continually creating new land. For example, future settlers are likely to find Hawaii has an extra island. For more than 80 million years, a "hot spot" of rising magma from deep within the Earth has punched through the floor of the Pacific Ocean to build a series of islands on the crust moving over it. This means Hawaii's Big Island will soon get a baby brother off its south coast, formed by a submerged volcano called Lo'ihi. It is growing fast and should emerge within 100,000 years, depending on sea-level rise. Geologists expect that its peak will eventually tower above all others in the Hawaiian chain.
    In the much longer term, Europe and Africa could also get swathes of new territory. That's because Africa is moving north-east by about 2.5 centimetres a year, gaining about a centimetre a year on Europe, which is moving in the same direction. In principle, this crunching could shut the Strait of Gibraltar within the next few million years. Without the inflow of Atlantic water, the Mediterranean Sea would eventually evaporate. Countries in southern Europe and on the north African coast would effectively expand across the newly exposed seabed until they join up.
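    The "few million years" figure for the Strait of Gibraltar is a back-of-envelope estimate: the strait is roughly 14 km wide at its narrowest, and Africa gains about a centimetre a year on Europe. A quick sketch of that arithmetic (the 14 km width is my input, not quoted in the article):

    ```python
    # Rough closure time for the Strait of Gibraltar, assuming the
    # ~1 cm/yr convergence between Africa and Europe is taken up
    # entirely at the strait, which is ~14 km wide at its narrowest.
    strait_width_m = 14_000
    closing_rate_m_per_yr = 0.01
    years = strait_width_m / closing_rate_m_per_yr
    print(f"{years / 1e6:.1f} million years")  # ~1.4 million years
    ```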
    If our descendants are still around millions of years from now, they may have to figure out how to divvy up whole new parts of the world.
  • Deep future: Will there be any nature left? (29 February 2012, by Michael Marshall; Magazine issue 2854)
    We are causing a mass extinction event. What species do we stand to lose in the coming millennia, and what new creatures will emerge?
    ON THE face of it, the future of the natural world looks grim. Humans are causing a mass extinction that will be among the worst in Earth's history. Wilderness is being razed and we are filling the air, water and land with pollution.
    The bottom line is that, barring a radical shift in human behaviour, our distant descendants will live in a world severely depleted of nature's wonders.
    Biodiversity, in particular, will be hit hard. Assessments of the state of affairs make consistently depressing reading. Almost a fifth of vertebrates are classed as threatened, meaning there is a significant chance that those species will die out within 50 years.
    The main cause is habitat destruction, but human-made climate change will be increasingly important. One much-discussed model estimates that between 15 and 37 per cent of species will be "committed to extinction" by 2050 (Nature, vol 427, p 145) as a result of warming.
    "It will be a new world," says Kate Jones at the Institute of Zoology in London, UK. The ecosystem will become much simpler, dominated by a small number of widespread, populous species. Among animals that are "incompatible" with humans - we may like hunting them or colonising their habitat, for example - few will survive. "I don't have much hope for blue macaws, pandas, rhinos or tigers," Jones says.
    Ultimately, though, life will recover: it always has. The mass extinctions of the past offer hints as to how the ecosystem will eventually bounce back, says Mike Benton at the University of Bristol, UK. The two that we know most about are the end-Permian extinction 252 million years ago, which wiped out 80 per cent of species, and the less severe end-Cretaceous extinction 65 million years ago, which famously took out the dinosaurs. The Permian extinction is more relevant because it was caused by massive global warming, but Benton cautions that the world was very different then, so today's mass extinction will not play out in quite the same way.
    Recoveries usually have two stages. If ours pans out in the same way, the first 2 to 3 million years will be dominated by fast-reproducing, short-lived "disaster taxa". These will rapidly give rise to new species and bring the world's species count back up (Proceedings of the Royal Society B, vol 275, p 759).
    But a lot of things will still be missing. Ecosystems will be simple, with similar species doing similar things. Herbivores will be less diverse, and top predators may be absent altogether in many places.
    That's where longer-lived, slower-evolving species come in to restore the full complexity of the ecosystem. But this can take up to 10 million years, much longer than even the most optimistic projections of the human future (Proceedings of the National Academy of Sciences, vol 105, p 11536).
    It doesn't have to be like that. We can take action now to get the recovery going, although we don't know how much we can accelerate it.
    Conservation biologists are increasingly thinking the unthinkable, such as relocating species to places where they can thrive while abandoning them to their fate in their native ranges.
    That may seem unnatural, but given that human influence has already touched almost every ecosystem on Earth, is "natural" even a useful concept any more?
    Even more radically, we might be better off encouraging the formation of new species and ecosystems rather than struggling to save existing species that have no long-term future, like pandas. "There's no way I'd want to get rid of them," says Jones, "but things do change and adapt and die."
    Benton says the most important thing is to rebuild biodiversity hotspots such as rainforests and coral reefs. That needn't be a gargantuan task. A recent analysis suggests that damaged wetlands can be restored within two human generations (PLoS Biology, vol 10, p e1001248).
    Beyond that it may be possible to start "evolutionary engineering". For instance we could divide a species into two separate habitats and leave them to evolve separately, or introduce "founder" species into newly rebuilt ecosystems.
    Nature may solve the problem for us by providing founder species from an unexpected source. Animals such as pigeons, rats and foxes are already flourishing alongside humans and may well give rise to new species, becoming the founders of the new ecosystem.
    If you are disturbed by the prospect of a world colonised by armies of rapidly evolving rats and pigeons, look away now.
  • Deep future: How will our language evolve? (29 February 2012, by David Robson; Magazine issue 2854)
    Given the rapid change in language in just a few millennia, what will it be like tens of thousands of years from now?
    SHOULD your descendants uncover this page, yellowed and curling, thousands of years from now, many of these words will be incomprehensible - even if they call themselves speakers of English. After all, we struggle to decipher Old English texts like Beowulf. You might be able to understand the hero's declaration that "Béowulf is mín nama", but a millennium of language evolution has washed away the meaning from "grimma gaést Grendel" - the "ghastly demon Grendel".
    If our language has transformed almost beyond recognition in just 1000 years, how might it sound in tens of thousands of years? Languages are largely shaped by the unpredictable whims of their speakers, but by examining the forces facing our language, we can speculate about how our descendants might speak.
    The most obvious question is whether they will be using English at all. Although English is the world's lingua franca, its popularity largely hinges on the present economic importance of Anglophone countries. Should another country come to dominate world trade, our descendants may all be learning its language. If so, it's likely that they would begin to incorporate some of its terms into their own language - in the same way that Italians say that they will listen to a "podcast" on their "tablet" at the "weekend". But very popular languages tend to be resilient to invasion, so there's no reason to think that English will disappear entirely.
    It's more likely that it will splinter and fragment. We can already see new dialects forming in many of the UK's former colonial territories, such as Singapore and Jamaica. Thanks to immigration, the internet and mass media, words from such dialects often feed back through the English-speaking world - as can be seen in Jamaican variations that are now sweeping through London slang, such as the use of "buff" to mean attractive, and "batty" to mean a person's bottom. Given enough time, these dialects might diverge entirely. If so, English may end up like Latin - dead, but survived by numerous offspring.
    Do such grand transformations make it impossible to predict anything specific about future English? Certainly, the language is changing quickly enough as it is; the Oxford English Dictionary adds between 2000 and 2500 words each year, says its senior assistant editor Denny Hilton. But there may be thousands of new words that fail to catch the attention of the OED's lexicographers. When Erez Lieberman Aiden and Jean-Baptiste Michel at Harvard University studied Google's corpus of digitised books from the last century, they found around 8500 new words entering the language every year. Many of these are rarely used - words like postcoarctation, reptating and subsectoral.
    Use it or lose it
    By looking at English's journey since Beowulf, we can at least identify trends that might continue. Its future grammar might lack some of the nuances that rule the sentences on this page, for instance. We've already lost many of the rules that governed the language of Beowulf - English nouns no longer have different genders, for instance.
    Today, this ongoing simplification can be seen in the way we use the past tense. There are lots of irregular verbs whose past tenses do not have the more typical "-ed" ending - we say "left" rather than "leaved", for example. But time is slowly taming these irregular verbs, and the effect depends on how common these verbs are. By studying English texts from the last 1000 years, Lieberman Aiden and Michel noticed that the less a verb is used, the more likely it is to become regular. "If a word is rare, we don't always remember if it is irregular," says Lieberman Aiden - so we assume it follows the pattern of more familiar verbs.
    "To wed", which is now used in only very specific contexts, is already in the throes of change. People are beginning to say they are "newly wedded" rather than "newly wed", for example. Others are more stubborn. Having found the way a word's popularity can influence its chances of linguistic change, Lieberman Aiden and Michel started to predict the future lifespan of certain irregular verbs. For instance, given its relative rarity, there is a 50 per cent chance that "slunk" will become "slinked" within 300 years (see diagram).
    "To be" or "to have", which are used in around 1 in 10 sentences, have "half-lives" of nearly 40,000 years (Nature, vol 449, p 713). The researchers speculate that irregular plurals will follow a similar trend - "men" could become "mans", for example - though they haven't tested the idea yet.
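    The half-life framing implies a simple exponential model: a verb with half-life h has a 50 per cent chance of regularising within h years, and the probability for any other horizon follows from the same decay curve. A minimal sketch under that assumption (the function name is mine):

    ```python
    # Exponential-decay model implied by the "half-life" framing:
    # after t years, the chance a verb is still irregular is 0.5**(t/h).

    def p_regularised(t_years, half_life):
        """Probability an irregular verb has regularised within t_years,
        assuming exponential decay with the given half-life."""
        return 1 - 0.5 ** (t_years / half_life)

    # "slunk" -> "slinked": 50% within its ~300-year half-life
    print(f"{p_regularised(300, 300):.0%}")     # 50%
    # "to be"/"to have": half-life ~40,000 years, so over 300 years...
    print(f"{p_regularised(300, 40_000):.1%}")  # ~0.5%
    ```

    The contrast makes the frequency effect concrete: over the same 300 years, the rare verb is a coin flip while the very common ones are near-certain to survive unchanged.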
    In a similar way, we can predict which words will be ousted by new coinages or terms imported from another language. By examining linguistic evolution across the Indo-European languages, Mark Pagel at the University of Reading, UK, has found that this too depends on a word's frequency - the more common it is, the longer it lingers (Nature, vol 449, p 717). That's partly because we are less likely to use the wrong term if we hear the right term often enough.
    In his forthcoming book, Wired for Culture, Pagel also argues that words have evolved to suit their purpose - if they are common and represent important concepts, they will be short and easy to say (see "Forget fittest, it's survival of the most cultured"). Such words are "highly fit", he says, using a Darwinian analogy. "It's difficult for a new word to dislodge them."
    This can be seen in Beowulf's declaration. "Nama" clearly lingers as "name", a very common word then and now. Numbers, question-words and other simple nouns have similar staying power.
    So, if your descendants do speak a form of English and happen to be reading this page, there's a chance they may find some meaning in simple sentences like "what is your name?" or "I drink water". There's a slim chance they might even comprehend "Hello from the year 2012".
    Live= sgt pantyfire    PSN= pantyfire
  • Deep future: What will our descendants know about us?
    09 March 2012 by Bob Holmes. Magazine issue 2854.
    What clues to the way we live today will archaeologists unearth in the millennia to come? What will endure, and what will fade away?
    Read more: "100,000 AD: Living in the deep future"
    WHEN humans in the far future are piecing together a picture of the primitive civilisation of 2012, archaeology will surely be the best way to go about it. After all, even the best libraries, archives and museums can be undone by a single fire, as the fate of the library of Alexandria amply illustrates (see "Where will we live?").
    So what will archaeologists working 100,000 years from now discover about us? Only the luckiest of artefacts will avoid being crushed, scattered, recycled or decomposed. You, personally, will almost certainly leave nothing behind that survives that long. To get a sense of why, just point time's arrow the same distance in the opposite direction. Around 100,000 years ago, anatomically modern humans were just emerging from Africa to populate the world. Most of what we know about them is guesswork, because the only clues that remain are sharp stone tools and a handful of fossils.
    You are especially unlikely to leave your bones behind. Fossilisation is an exceedingly rare event, especially for terrestrial animals like us - though with 7 billion people on the planet, at least a few of us will no doubt achieve lasting fame.
    Luckiest - and rarest - will be the "instant fossils". These form when people or animals die in calcium-rich seasonal ponds and wetlands, or in caves. In both situations, bones can mineralise quickly enough for fossilisation to win the race against decomposition, says Kay Behrensmeyer, a palaeobiologist with the US National Museum of Natural History in Washington DC. One wildebeest toe-bone in southern Kenya soaked up calcium carbonate so quickly that it began to turn to stone within two years of death.
    Future fossil hunters won't be looking for us in graveyards, since bodies buried there crumble into dust within a few centuries. Instead, the richest human bonebeds will likely be found in the debris of catastrophic events, such as volcanic ash or the fine sediments left by the recent tsunamis in Asia, Behrensmeyer says. A few bodies might be mummified in peat bogs or high deserts, but they will decay if conditions change, as is likely over a span of 100,000 years.
    Those same changes will also lay waste to other important clues to our civilisation: our homes. Climate change and rising sea levels are likely to drown coastal cities such as New Orleans and Amsterdam (see "Where will we live?"). In these cases, waves will probably destroy the parts of buildings above ground, and basements and pilings will soon be buried by sediments. While concrete may dissolve over the millennia, archaeologists will recognise the precise rectangular patterns of sand and gravel that remain as a sign of purposeful design. "There is nothing at all in nature like the patterns we make," says Jan Zalasiewicz, a geologist at the University of Leicester, UK.
    Building our own geology
    Nowhere will these designs be more unmistakable than in our biggest structures. A few human artefacts, such as open-pit mines, are essentially geological features already, and will last for hundreds of thousands of years as testimony to our earth-moving powers. Our largest dams, such as the Hoover dam in the US and China's Three Gorges dam, contain such an immense volume of concrete that some pieces will certainly survive that long, too, says Alexander Rose, executive director of The Long Now Foundation, based in San Francisco, California. A few structures - most notably the Onkalo nuclear waste repository in Olkiluoto, Finland - are even being engineered to survive intact for 100,000 years.
    We have also been busy building another massive legacy that will be the real bumper crop for future archaeologists: our garbage. The landfill sites where most of our goods eventually end up are almost ideal places for long-term preservation. When full, modern landfills are typically sealed with an impermeable layer of clay, so that the contents quickly become devoid of oxygen, the biggest enemy of preservation. "I think it's fair to say that these sites will remain anaerobic over geological time," says Morton Barlaz at North Carolina State University in Raleigh. Under such conditions, even some organic materials such as natural fabrics and wood are likely to avoid decomposition - though over the millennia they will gradually transform into something resembling peat or soft coal, says landfill expert Jean Bogner of the University of Illinois at Chicago.
    A few materials will be preserved just as they are. We don't make much from stone any more, but a few statues might survive, buried safely away from erosion. Ceramic plates and coffee mugs should last indefinitely, too, just as the potsherds of early human civilisations have. Some metals, such as iron, will corrode quickly, but titanium, stainless steel, gold and others will last much longer. King Tut's gold, after all, looks almost unchanged after 5000 years. "There's no reason to think that wouldn't be the same after 100,000 years," says Rose. Indeed, titanium laptop cases, their insides long since corroded, may end up as one of our civilisation's most lasting artefacts. Who knows - scholars of the future may construct elaborate theories about our religious practices based on these hollow tablets and the apple-shaped figure etched into their surface.
    The fact is that no matter how much we may try to preserve a legacy for future generations, we can never know which aspects of our civilisation will interest our descendants. Today, for example, our study of early humans is informed by Darwin's theories, a perspective that was inconceivable only a century ago. Even if the objects in our museums survive, they will only tell future generations what we thought of ourselves. What they will think of us is something no one reading these words today can fathom.
    Lowly grail
    They say diamonds last forever, but you might not expect that alongside those sparkly gems, a future museum may well be showcasing our polystyrene coffee cups.
    That's because the petroleum-derived pellets of polystyrene cannot be biodegraded by any known microorganism. This stuff could last a million years.
    Having said that, in the wild the cups will likely crumble into unrecognisable lumps. And if recent efforts to engineer fungi that can decompose polystyrene succeed, not even these will survive. But protected in a landfill and undisturbed for millennia, the cups could retain their shape sufficiently to allow future archaeologists to deduce what we used them for.
  • Impossible explosion: The Buncefield blast explained
    05 April 2012 by Will Gray. Magazine issue 2858.

    Video: Explosion tests simulate Buncefield blast

    Could trees and bushes have been to blame for the force of one of Europe's biggest peacetime explosions? Violent experiments may now have solved this enigma
    "MAKE sure you've put in your ear plugs," warns Vincent Tam as we prepare for one of the largest experimental explosions in the UK's history. We're at a remote test site at RAF Spadeadam in Cumbria and in front of us is a long tunnel of plastic sheeting with a line of fir trees inside. It is also filled with flammable propane gas and the fuse is about to be lit. "Don't blink," says Tam, a fuel explosions specialist at energy company BP. "Watch the ground. You should see the grass rippling towards you with the shock wave. Then brace yourself for the impact..."
    I'm here for what promises to be the culmination of an investigation into the UK's biggest ever peacetime explosion. Just before dawn on Sunday 11 December 2005, a large fuel storage depot at Buncefield in Hertfordshire was ripped apart by a mammoth blast. It damaged buildings up to 8 kilometres away and was audible in France and Belgium. Amazingly there were no deaths, but the explosion and resulting fire left 43 people injured, thousands more had to be evacuated and the damage totalled around £1.5 billion. "The amount of debris scattered everywhere was amazing," recalls David Painter, an inspector for the UK's Health and Safety Executive (HSE), who visited the site a few days later.
    The immediate cause was clear: about 180 tonnes of petrol had leaked from a storage tank and the resulting vapour cloud had caught fire. However, the patterns of damage around the fuel depot were puzzling. And though the blast squashed many metal structures flat, calculations suggested that the vapour cloud should not have been capable of wreaking such havoc. Things didn't add up.
    See more in our picture gallery: "What made that bang? Six mystery blasts"
    After seven years of meticulous detective work, the investigation team now think they have some answers. Understanding this disaster is more than an academic question. Leaks from pipelines, refineries and fuel depots are not uncommon, and a major explosion occurs somewhere in the world every two years or so. Finding out what happened should help prevent them in future. And that is why I'm now watching a line of small trees; if the investigators are correct, this overgrown hedgerow is the key to the Buncefield blast.
    In the days following the explosion, HSE officials scoured the site, interviewed witnesses and replayed footage from CCTV cameras around the depot. They soon worked out that faulty switches in a storage tank had allowed the petrol to overflow. With no wind, the volatile fuel's vapour cloud had blanketed the site. At 6.01 am, an electrical circuit in a pump house sparked and the vapour ignited.
    Beyond that the picture was confusing. In particular, no one was sure why the fuel had exploded with such force. When lit, flammable vapour will usually burn rather than explode. The exception is vapour in a confined space such as a building, where pressure can rise so rapidly that the structure will suddenly burst like a balloon. But at Buncefield the vapour cloud was out in the open.
    In an unconfined cloud of fuel, a flame will typically travel at about 10 metres per second, which is far too slow to create a shock wave. But what if a flame moves much faster? Air and fuel vapour in front would be unable to move aside to allow the hot gases to escape, so pressure would build up along the flame front, creating a strong pressure wave that could damage anything in its path. Could this have happened at Buncefield?
    To investigate, a team of researchers modelled a flame moving at speeds of several hundred metres per second. Their analysis suggested a burning cloud could generate a pressure peak of about 1.5 bars. This is not enough to account for the devastation seen on-site: vehicles, including a brand-new Porsche, looked as if they had been through a crusher, and oil drums and metal cabinets were crumpled like paper. "Just from the car tyres which came off the wheels, you can see the numbers don't match up," says Tam. "Tyres are inflated to 2 bars so you're talking peak pressures of at least 4 or 5 bars."
    So the team tried to recreate this level of damage. They sealed metal boxes and drums like those on the site in a chamber and raised the pressure to 10 bars. This still wasn't enough. It wasn't until they detonated 170 kilograms of TNT that they were able to replicate the damage. "To crush a car to a similar level as in Buncefield we had to place it less than 10 metres from the explosives, where the pressure was about 16 bars," says Tam.
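    Explosive tests like this one are conventionally compared using Hopkinson-Cranz scaling: blasts with the same scaled distance Z = R / W^(1/3) (standoff R in metres, charge mass W in kilograms of TNT) produce similar peak overpressures. A quick sketch for the team's 170-kilogram test; note that converting Z into an actual overpressure requires empirical data such as the Kingery-Bulmash curves, which are not reproduced here:

    ```python
    # Hopkinson-Cranz scaled distance: blasts with equal Z produce
    # similar peak overpressures. Sketch only; mapping Z to a pressure
    # needs empirical blast curves (e.g. Kingery-Bulmash).
    def scaled_distance(r_metres, w_kg_tnt):
        return r_metres / w_kg_tnt ** (1 / 3)

    # The team's test: a car 10 m from 170 kg of TNT, where the
    # article reports a pressure of about 16 bars.
    z = scaled_distance(10, 170)
    print(f"Scaled distance Z = {z:.2f} m/kg^(1/3)")
    ```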
    They concluded that the only way to create this kind of pressure was if the whole cloud had exploded. For that to occur, the flame front would have had to spread through the vapour at supersonic speeds of about 1800 metres per second. At this speed the fuel ahead of the flame is compressed and heated to the point at which it ignites spontaneously - a process called autoignition. Suddenly the reaction becomes self-sustaining and the cloud explodes. But how could the flame accelerate to such speeds?
    Clues came from a catastrophic event, also in the UK, more than 30 years earlier. In 1974 a pipe ruptured at a chemical plant at Flixborough, Lincolnshire, spilling about 40 tonnes of cyclohexane gas. The vapour spread in a layer across the site, ignited and exploded, wrecking the plant and killing 28 people.
    Just as at Buncefield, the cloud should not have exploded. Researchers eventually decided that the culprit, bizarrely, was the plant's jumble of metal tubes and pipe work. They suggested that when the flame reached these structures it wrapped around them. This increased the flame's surface area and helped it burn faster. The pipe work also created turbulence that mixed flame and vapour, increasing the combustion rate and helping the flame accelerate.
    Yet there were few buildings and almost no pipe work at the Buncefield site. Besides, despite evidence from Flixborough, no one had ever proved that these kinds of structures could accelerate a flame to detonation.
    The unexpected force of the blast wasn't the only puzzle. Some lamp posts and metal signs were bent in one direction, but others close by were bent the opposite way. Scratches and damage on trees and walls showed similar patterns. Cars, too, had been blown about in different ways depending on where they were parked. With something like a bent lamp post, it seemed safe to assume the blast was moving in the direction it was leaning, says Mike Johnson from oil and gas consultants GL Noble Denton and one of the lead investigators.
    When the team examined videos of test explosions they noticed that a detonation moving through a cloud of gas can also create a force in the opposite direction. They realised that the high pressure zone just ahead of the detonation acts like a cap, preventing the hot gases behind from expanding. The only way they can escape is in the opposite direction - in other words, a detonation wave can send out a powerful flow of hot gas behind it, rather like the exhaust of a jet engine. Simulations by Tam and a team of engineers at Kingston University in West London, UK, confirmed this mechanism (Journal of Loss Prevention in the Process Industries, vol 24, p 187).
    It turned out that this effect has been seen before, though no one had investigated it in any detail. In one case in 1989, a massive liquefied petroleum gas blast at Ufa in what is now Russia threw trees in some areas towards the epicentre rather than away from it.
    With the blast patterns making more sense, the investigators were finally able to pinpoint where they believed the detonation had begun. The evidence indicated a spot near the junction of two roads that ran alongside the depot (see map). Intriguingly, both roads were lined with shrubs and small trees; could their twigs and branches have behaved like the pipe work at Flixborough and created a supersonic flame capable of triggering an explosion?
    When the team tested this in simulations, their digital hedgerow created enough turbulence to accelerate a flame close to the speed of sound. This is near the point at which detonation should occur, Johnson says.
    Models are one thing, but whether a real hedgerow behaves this way is another matter. That is why we are waiting at RAF Spadeadam, with all eyes fixed on a row of small trees. "Today is one of the defining moments," says project manager Bassam Burgan from the UK Steel Construction Institute, which is coordinating the research. "If this detonates we know it can happen with vegetation. If it doesn't, there's serious thinking to be done."
    The team has arranged the trees into a 100-metre-long hedgerow and enclosed the whole lot in a steel frame covered with plastic sheeting. The tube fills slowly with propane. Finally a radio message confirms the gas has reached the correct concentration. This is it. A flash of fire lights up the tunnel at one end and a bright flame races through the trees... then fades.
    No detonation. No shock wave. Just a disappointed silence from the crowd. "It's like waiting to see the best firework in the world and it just fizzling out," sighs one engineer.
    Wrong trees
    Afterwards we learn that the flame accelerated through the trees for 10 metres, reaching a top speed of 140 metres per second. But it never came close to exploding. Johnson suggests that though the trees helped to accelerate the flame front, there wasn't enough turbulence to reach detonation. "We know it is possible," says Johnson. "We just haven't got the right trees yet."
    Three months later, in February this year, in a second test with a different set of trees, the propane exploded. "Rather than the flexible pine trees we had last time, we had dense deciduous trees like those at Buncefield," says Burgan. "It's an exciting result."
    A few years ago no one would have predicted that a row of trees and shrubs could make the difference between a serious fire and a catastrophic explosion. But most of the team now back the hedgerow theory. "To explain it any other way you would have to go to some completely new form of flame propagation that generates high pressures," says Johnson. "You would have to step outside what is known about combustion and explosions."
    It now seems that trees may also have played a key role in vapour cloud explosions like that at Port Hudson in Missouri in December 1970 and at Brenham, Texas in 1992. More recently, in October 2009, a near-identical explosion occurred at the Catano oil refinery in San Juan, Puerto Rico. The shock wave - equivalent to a 2.8 magnitude earthquake - tore up a highway and forced the evacuation of 1500 people as the plant burned for three days. Again a leaked vapour cloud ignited and caused heavy damage, and again the site had little pipe work and few chimneys. Yet there were trees aplenty.
    Back in the UK, spill prevention and leak-monitoring procedures at fuel depots have been updated. In the longer term, the investigation might change the way storage depots, refineries and pipelines are designed, and how the sites are landscaped. Along with conventional safety features like sensors and alarms, site operators may have to rethink the way that trees, hedges and shrubs are positioned. Even structures on nearby commercial developments could help to accelerate a flame, says Johnson.
    There is still plenty to learn about the complexities of combustion, though, and the team's experiments aren't over. Another hedgerow test is planned for 2012, which should help reveal what density of branches constitutes a hazard. Meanwhile, physicists at Imperial College London are investigating whether dust, leaves and dirt on the site could have contributed to the severity of the blast through a poorly understood process called episodic combustion.
    According to this theory, a pressure wave created by rapid combustion kicks up any lightweight material on the ground ahead of the flame front. This stuff heats up and, if combustible, ignites, radiating more heat into the fuel vapour and making the flame jump ahead. Perhaps this process helped accelerate the flame across open areas such as car parks? "The whole site had debris and leaves everywhere, so there was plenty there to make that happen," says Bob Simpson of the HSE.
    These investigations could have consequences far beyond the oil and gas industry. Understanding the way that burning fuel can explode is important for building a novel aircraft propulsion system called a pulse detonation engine, says Elaine Oran, an engineer at the US Naval Research Laboratory in Washington DC. These engines use a detonation wave to burn fuel and air and should be far more efficient than conventional designs. In the long term, the knowledge could also help astrophysicists understand why white dwarf stars explode, she says. Known as type Ia supernovae, these explosions begin with subsonic thermonuclear flames that spread across the star and somehow trigger a supersonic detonation. It is thought that turbulence plays a role. "Everything we learn adds up," says Oran. Who would have guessed that a hedgerow might eventually offer insights into one of the most violent events in the universe?
    Will Gray is a freelance writer based in London
  • Antibiotics may make you fat
    28 March 2012 by Jessica Hamzelou. Magazine issue 2858.
    Editorial: "Antibiotics are wonder drugs no more"
    THE trillions of bacteria that colonise our guts are in jeopardy. Overusing antibiotics has not only led to the development of dangerous superbugs, but has changed the bacteria that live inside us. Now evidence suggests that new gut floras may be responsible for our expanding waistlines.
    Antibiotic use has been rising for the past 70 years, and the drugs are now often prescribed as a precaution for illnesses whose cause has not been confirmed as a bacterial infection. Martin Blaser, a microbiologist at New York University, fears that over-prescribing antibiotics could be harming some of the communities of "good" bacteria that line our intestines.
    The effects could be long-lasting, too. For example, some antibiotics seem to permanently oust Helicobacter pylori from their home in our stomachs. Widespread use of antibiotics has correlated with a fall in the number of people playing host to H. pylori. That might seem like good news since the bug has been linked to stomach cancer and gastric ulcers, both of which have become less common. However, these positive outcomes coincide with a surge in cancers of the oesophagus, attributed to the more acidic environment H. pylori leaves behind when it vacates the stomach (Nature Reviews Microbiology, DOI: 10.1038/nrmicro2245).
    To investigate whether overusing antibiotics could also play a part in the rise of obesity, Blaser's team fed infant mice low doses of penicillin to mimic doses given to farm animals. After 30 weeks, penicillin-fed mice were between 10 and 15 per cent bigger and twice as fat as drug-free mice.
    When the team looked at the mice's gut bacteria, they found that the antibiotic-fed mice had a different complement of bugs to the untreated mice. Low doses of antibiotics had seemingly shifted the balance of certain gut microbes, reducing the numbers of Lactobacillus, which is a "good" bacterium linked to a lower risk of cancer recurrence.
    To confirm that the mice owed their supersize to an altered gut microbiome, the group turned to germ-free mice, which are bred in a sterile environment and have no gut bacteria. Within five weeks of being given gut bacteria from the mice fed antibiotics, the once germ-free mice were 35 per cent larger than mice with a regular microbiota.
    In the initial experiment, the biggest mice were those that had started antibiotic treatment from birth. Even mice that were only given drugs for four weeks ended up as large as mice on antibiotics for the full 30 weeks. This suggests that gut flora may be most vulnerable to disruption in the earliest moments of life, says Blaser.
    Antibiotics used to treat children may also have a detrimental effect on their immune systems, says Blaser. In a separate study in mice, his team mimicked the short courses of higher dose antibiotics that young children tend to receive for infections. The group then investigated whether these pulsed doses were having any effect on helper T-cells - a group of immune cells that secrete chemicals to direct the immune response. They found that the levels of these chemicals were significantly lower in antibiotic-fed mice, suggesting that their immune systems may have become compromised. Blaser presented his findings at the International Human Microbiome Congress in Paris, France, last week.
    Although no one yet knows why certain groups of bacteria may affect weight, Blaser says that we might expect young children exposed to antibiotics to gain weight like the mice. Indeed, similar effects have already been spotted in humans: when Teresa Ajslev and her colleagues at Copenhagen University Hospital in Denmark followed the development of 28,000 babies, they found that those given antibiotics within the first six months of life were more likely to be overweight at age 7, even if their mother had a healthy weight (International Journal of Obesity, DOI: 10.1038/ijo.2011.27).
    What's more, the problem could get even worse for future generations, Blaser says. We think babies first acquire bacteria during birth, when they travel through their mother's vaginal canal or are exposed to hospital environments. A newborn girl treated with antibiotics could grow up with an altered microbiome, and be unable to provide her own children with the missing bacteria.
    Blaser is not the only one to be concerned. Kristine Wylie at Washington University in St Louis, Missouri, says it could well be true that antibiotics are contributing to soaring obesity rates. "The pulses of antibiotics really reflect what children are given [in real life]," she says.
    A recent study of 3000-year-old human faeces suggests that the make-up of our microbiomes has changed (see "Look to ancient faeces for obesity cure"). To protect ourselves from disease, we may want to repopulate our guts with the bacteria we evolved with, says Blaser. The best places to investigate the role of bacteria are poorer countries with limited access to antibiotics, he adds. "You'd have to go to developing countries and start hoovering up faeces," says Brett Finlay at the University of British Columbia in Vancouver, Canada.
    "Microbes are not accidental - we have co-evolved with them," says Blaser. "They are useful, but they are changing as a result of lifestyle, and this is changing our disease risk." The answer isn't to stop giving antibiotics, he says. "A lot of what we have to do is research. We need to narrowly treat infections. We need better diagnostics and better therapeutics."
    Look to ancient faeces for obesity cure
    If antibiotics are ruining our gut flora, and consequently promoting obesity (see main story), how can we make it right again? One idea is to repopulate our depleted intestines with ancient bacteria, says Martin Blaser at New York University. Cecil Lewis and his colleagues at the University of Oklahoma in Norman are starting to get a pretty good idea of what these ancient bacteria might be, after studying fossilised faeces.
    Curious to see what a prehistoric microbiome - the collection of bacteria lining our intestines - might have looked like, Lewis and his colleagues collected ancient faeces from soil in caves and directly from the intestines of mummies around North and South America. All of the samples were between 1400 and 3000 years old.
    The team then extracted DNA before comparing its bacterial make-up to known microbiomes of modern Americans, rural African children, primates including chimps and gorillas, and the Tyrolean iceman Ötzi, who lived around 5000 years ago.
    Lewis's team was able to piece together microbiomes for each of the samples. "They do appear to be different," he says. Surprisingly, though, the ancient faeces have more bacterial DNA in common with those of non-human primates and children living in rural Africa than they do with modern, western gut microbiomes. "My first hypothesis would be that chlorinated water and antibiotics fundamentally changed human microbiomes," says Lewis, who presented the findings at the International Human Microbiome Congress in Paris, France, last week.
    "The association between antibiotics and obesity is important to explore," he says. So, should we be repopulating our guts with missing bacteria? "It's too early to tell if that's a good idea," says Lewis. "However, it is certainly an important idea that requires investigation."