India Police investigate pizza deliveries by drone




Police in the Indian city of Mumbai are reportedly looking into why a restaurant started using a drone to deliver pizzas without letting them know.

Francesco’s Pizzeria says it successfully used a remote-controlled four-rotored drone to send an order to a skyscraper about 1.5km (1 mile) away, the Economic Times reports. In a city that’s famous for its snarling traffic jams, the restaurant says drone deliveries could be a green solution that saves on time too. A video the pizzeria put together seems to show footage from one of the test flights.

But the city police now say they’re checking whether the restaurant asked permission from the civil aviation authorities. “As per norms, permission must be taken for flying any such object,” an air traffic control official says. A local police chief told the PTI news agency: “We are very sensitive towards anything that flies in the sky with the help of remote control.”

Indian security forces are nervous about the possibility of terror attacks using paragliders or drones, according to sources quoted by IBN Live. But Francesco’s insists the experiment was safe. A source told the Economic Times the drone never went higher than 130m (400ft) to avoid interfering with other traffic, and the craft was never out of the reach of the controller. Last year, Amazon said it was testing unmanned drones for deliveries, but said it could take up to five years for the service to actually start.

Source: WorldNewsChannel and Francesco’s Pizzeria

Army Gen. Martin E. Dempsey bristles when he hears someone use the word drone

ABOARD A U.S. MILITARY AIRCRAFT, May 22, 2014 – Army Gen. Martin E. Dempsey bristles when he hears someone use the word drone.

“You will never hear me use the word ‘drone,’ and you’ll never hear me use the term ‘unmanned aerial systems,’” the chairman of the Joint Chiefs of Staff said today. “Because they are not. They are remotely piloted aircraft.”

Dempsey spoke to Reuters and American Forces Press Service on his way back to Washington from Brussels and the 171st Chiefs of Defense Meeting at NATO headquarters.

The American people seem to have the image of robots “flying around semi-autonomously making their own decisions and conducting kinetic strikes without oversight by responsible human beings,” he said. “It’s not like that at all.”

There are more than 80 people for each remotely piloted vehicle, he said. They operate and maintain the aircraft, and analyze the information gathered. “It’s so important for us to remember that there is a man or woman in the loop,” he said.

And, whether a service member uses a bayonet or a remotely piloted aircraft with a Hellfire missile, “the ethical application of force applies,” Dempsey emphasized.

The law of armed conflict, the principles of war, U.S. ethics and legal bases apply no matter what the weapon, the chairman reiterated. “So, when we introduce remotely piloted aircraft into a theater in a Title 10 role, we apply the same standards,” he said.

The standards are predicated on the near-certainty of the effect — is the weapon going to do what the operators need it to do? Military personnel always assess the risk of collateral damage on people or buildings. And, “we ensure that we are achieving an effect with the appropriate behavior for the United States of America,” Dempsey said.

Remotely piloted aircraft are “a valid, useful and responsible military instrument in the way we use them,” he said. “So long as we continue to think of them that way and so long as we continue to use them in a transparent … ethical way, then I have no concerns about their use.”


By Jim Garamone

American Forces Press Service


New Mexico: State Game Commission waits until next month to consider banning hunters from using drones

ALBUQUERQUE, N.M. – Look out, Bambi. You’re still in danger from drones in New Mexico.

On May 13, the State Game Commission decided to wait until next month to consider outlawing hunters from using drones in New Mexico.

“We’ll take a good, hard look to see if it’s within our power to get an outright ban,” commission chairman Paul Kienzle said as the seven-member board unanimously voted to table a proposal that would prohibit what scientists call “unmanned aerial systems” from tracking down big game by monitoring their activity from the air.

“It’s a fair chase issue,” Robert Griego, colonel of field operations at the New Mexico Game and Fish Department told New Mexico Watchdog. “That’s what we always want to maintain. We don’t want things to get to the point where it’s just like shooting fish in a barrel.”

“A person can use a drone to find a trophy animal or simply find all the animals and get a head start on other hunters,” said John Crenshaw, board president of the New Mexico Wildlife Federation. “It’s unfair to the hunters and it’s unfair to the game.”

“It’s just wrong,” said Elisabeth Dicharry, an open spaces advocate from Los Lunas who wants to see an outright ban. “All species (such as coyotes and prairie dogs) should not be hunted with drones.”

The federal Airborne Hunting Act prohibits the use of aircraft to track or shoot animals, but there is no federal law covering drones.

The measure, tabled May 16, would make it illegal to use drones “to signal an animal’s location, to harass a game animal or to hunt a protected species observed from a drone within 48 hours.”

“It’s starting to grow nationwide,” said Oscar Simpson, chairman of the New Mexico chapter of Backcountry Hunters and Anglers. “I was talking to some sportsmen people here in New Mexico over the last nine months and I had three of them say their hunts were screwed up because somebody used a drone to move an elk out of the way, or to move it down to where they were.”

New Mexico isn’t the only state to consider outlawing drones for hunting.

Alaska, Colorado and Montana have already passed bans, and a combination of sportsmen’s groups and animal protection agencies are calling for states across the country to join in.

“As the price of drones comes down, they have a lot of potential for abuse,” Crenshaw told commissioners. “It seems the electronics is outrunning the rulemaking.”

While using drones for hunting is under fire, drones have also been used in places such as Africa to protect animals against poachers and to track the movement of herds for wildlife research.

“You’d look at having a research, a law enforcement exception,” Kienzle said after the meeting.

In an unusual twist, in Massachusetts, representatives of the People for the Ethical Treatment of Animals used drones to videotape hunters in the field. PETA said it did so to make sure hunters were obeying the law.

“To me, using drones to monitor wildlife or monitor hunters, that would be harassment,” Simpson said.

The New Mexico Game Commission plans to bring the issue up again when it meets next month in Ruidoso.

Drones aren’t new to New Mexico.

The state is home to the only flight test center approved by the Federal Aviation Administration — the Unmanned Aerial Systems Technical Analysis and Applications Center at New Mexico State University — that flies and tests drones in the southern portion of the state.

Drone deployment has spiked across the country in recent years. They’ve been employed, for example, by highway officials to check traffic conditions and by forest rangers to track forest fires.

Last December, Amazon executives unveiled plans to use drones to deliver packages.

But there’s been pushback, with some civil libertarians expressing concern about drones being used as “Big Brother” and invading privacy.

Legislation was introduced in Louisiana and an ordinance put up for a vote in Colorado that went so far as to allow property owners to shoot down drones for trespassing. Neither measure passed.

Here’s more from Simpson:

Oscar Simpson, the chairman of the New Mexico chapter of Backcountry Hunters and Anglers, explains why he wants New Mexico to ban the use of drones for hunting. Interview by Rob Nikolewski of New Mexico Watchdog, 5/15/14.

Source: Watchdog 5/16/14



The Conference on Disarmament today heard a statement by its Acting Secretary-General, Michael Møller, as well as presentations…by France on the work of the informal expert meeting on lethal autonomous weapons systems…

FRANCE, presenting information on the informal expert meeting on lethal autonomous weapons systems within the context of the Convention on Conventional Weapons, said there were 30 statements in the general debate, 24 statements in the closing debate, and many statements made during the technical sessions. This had demonstrated the interest in the emerging topic of lethal autonomous weapons systems. Many delegations were only just starting to work on this topic, and the meetings had allowed an in-depth discussion of it. The atmosphere was very constructive, showing the desire of all to learn more about this topic. There were notable debates on the concepts of autonomy, human control and responsibility. More in-depth consideration of these issues was needed. There was also a debate on international humanitarian law. The importance of the Convention on Conventional Weapons was underlined. This was the first meeting that allowed participants to exchange views on the technical, legal, ethical and military aspects of lethal autonomous weapons systems. The report reflected the discussions in an objective manner and did not contain any recommendations.



Source: UNOG (United Nations Office at Geneva)

United Nations. Lethal Autonomous Weapons Systems. Statements by Countries

At the 2013 CCW Meeting of High Contracting Parties, a new mandate on lethal autonomous weapons systems (LAWS) was agreed on. The mandate states:

“…Chairperson will convene in 2014 a four-day informal Meeting of Experts, from 13 to 16 May 2014, to discuss the questions related to emerging technologies in the area of lethal autonomous weapons systems, in the context of the objectives and purposes of the Convention. He will, under his own responsibility, submit a report to the 2014 Meeting of the High Contracting Parties to the Convention, objectively reflecting the discussions held.”

The Meeting of Experts was chaired by Ambassador Jean-Hugues Simon-Michel of France.



“…It is crucial, in our view, that any use of a weapon in armed conflict complies with international humanitarian law. Among roboticists and lawyers alike, there is serious doubt that autonomous weapons can ever be programmed in a way to guarantee this compliance. One further consideration: While in the case of a war crime perpetrated by a human actor legal responsibility can be, at least in principle, established, it is fundamentally unclear how to resolve the issue once the autonomous decision of a machine is at the root of the crime…”



Statement by H.E. Ambassador Pedro Motta Pinto Coelho, Permanent Representative of Brazil to the Conference on Disarmament. CCW Informal Expert Meeting on Lethal Autonomous Weapons Systems, 13 May 2014

“…The increasing amount of money spent by governments and the private sector on research into autonomous systems is an unequivocal indication of a technological trend that cannot be ignored. Many military experts support the idea of using this new technology in order to maximize compliance with IHL, reduce the number of human casualties (combatants and non-combatants) and decrease their military budgets. Other experts maintain that the use of lethal autonomous systems would imply a “dehumanization of warfare”. They point out that ethical and moral standards require meaningful human supervision of decisions to take life. They also emphasize that key issues must be urgently addressed, such as the level of automation we aim to achieve and which functions of these lethal systems should not be allowed to operate autonomously...”



Opening Statement of the Republic of Croatia. CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems. Geneva, 13 May 2014

“…CCW was always seen as a forum for open-minded discussions and exchange of views on disarmament issues at the multilateral level. Since we all agree that human suffering resulting from an armed conflict cannot and must not leave the international community indifferent, we would like to call upon all High Contracting Parties to use this unique opportunity of having a number of globally renowned LAWS experts among us and to give their contribution in this week’s discussions…”


Czech Republic

“…If it has been quite difficult to keep a balance between humanitarian concerns and security requirements in the past, it will be an even bigger challenge to strike it within the context of the sophisticated autonomous weapons of the future. This is only one of the reasons why we think it is important to start work on the needs of protecting civilians and combatants from the possible effects of LAWS well in advance, before they are developed. The Czech Republic, like many other States parties to the CCW, does not have a firm, coordinated national position on, or an approach towards, many aspects of LAWS. Views that might be expressed in two national presentations, provided by our experts from the Czech Defense University, will not represent a national position on any aspect of research, development or production, or of the future use, of LAWS. Our hope, however, is that we could start to build one on the results of this meeting…”



Ecuador

“…Ecuador considers it unacceptable and inadmissible that fundamental decisions over the life or death of human beings should be assigned to lethal autonomous weapons. States must take action to prevent the creation and development of, and to halt investment in, fully autonomous lethal weapons, through national norms and laws that prohibit them and an international protocol prohibiting their creation, development and use…”



Statement of the Arab Republic of Egypt at the Meeting of Experts on Lethal Autonomous Weapons, 13-16 May 2014. By: Ambassador Dr. Walid M. Abdelnasser, Permanent Representative of the Arab Republic of Egypt to the United Nations and other International Organizations in Geneva. Geneva, May 13th, 2014

“…We hope that this informal meeting of experts on this issue works as an eye-opener on a very important and challenging development in the course of weaponry research and development and the relevant considerations in this regard, particularly with reference to the issue of the possible ramifications on the value of human lives, the calculation of the cost of war, as well as the possibility of the acquisition of this weapon by terrorist and organized crime networks. This should lead to a prohibition on acquisition, research and development, testing, deployment, transfer and use.
Until such a result is achieved, we support calls to impose a moratorium on the development of such lethal technology in order to allow serious and meaningful international engagement with this issue. As military robotics gain more and more autonomy, the ethical questions involved will become even more complex. It might be too late after robotics have been fully developed to work on an appropriate response…”



Ministère des Affaires Étrangères. Convention on Certain Conventional Weapons. Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS). Geneva, 13-16 May 2014. General Statement.

“…Even if we cannot reach conclusions on all the topics that will be discussed, we must at least have the ambition of seeking a common understanding of what we mean by a “lethal autonomous weapon system”. In this regard, two elements seem essential:

- we are talking about emerging technologies, still under development, and therefore not yet used in existing weapons systems;

- we are talking about autonomous systems, not automated or remotely operated systems; they therefore imply an absence of human supervision. The French delegation will return to this question in more detail in its later interventions”



CCW EXPERT MEETING LETHAL AUTONOMOUS WEAPON SYSTEMS Geneva, 13 – 16 May 2014 General Statement by Germany

“…We firmly believe that there should be a common understanding in the international community that it is indispensable to maintain meaningful human control over the decision to kill another human being. We cannot take humans out of the loop. We do believe that the principle of human control is already implicitly inherent to international humanitarian law which, as said before, remains our binding guiding line also with regard to new weapon systems. And we cannot see any reason why technological developments should all of a sudden suspend the validity of the principle of human control. Therefore, we suggest that in the discussion about the definition and legal evaluation of lethal autonomous weapon systems we should also talk about what we as an international community understand as meaningful human control and declare it an indispensable principle of international humanitarian law…”


Holy See

Statement by H.E. Archbishop Silvano M. Tomasi, Permanent Representative of the Holy See to the United Nations and Other International Organizations in Geneva, at the Meeting of Experts on Lethal Autonomous Weapons Systems of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, 13 May 2014

“…For the Holy See, the fundamental question is the following: Can machines, well-programmed with highly sophisticated algorithms to make decisions on the battlefield that seek to comply with IHL, truly replace humans in decisions over life and death?

The answer is no. Humans must not be taken out of the loop over decisions regarding life and death for other human beings. Meaningful human intervention over such decisions must always be present…”



Permanent Mission of India to the Conference on Disarmament. Statement by Ambassador D.B. Venkatesh Varma. Permanent Representative of India to the Conference on Disarmament at the CCW Experts Meeting on Lethal Autonomous Weapons Systems. Geneva, May 13, 2014

“…We see current approaches as falling into two categories. The first is the view that a fresh look is needed at whether lethal autonomous weapon systems meet the criteria of international law and international humanitarian law, especially with regard to the principles of distinction, proportionality and precaution, and suggests a preemptive ban on the research, production and use of LAWS, or at least a moratorium until such time as there is clarity on the overall implications. The other view is that there is a spectrum of autonomy built into existing weapons systems and that a prohibition on LAWS is either premature, unnecessary or unenforceable…”



“I note that the press statement they issued yesterday focussed on the importance of ‘human control over the use of force’ and this seems to me to be a very sensible focus for these consultations and for whatever subsequent action we decide to take to build on these consultations. The definition of control, of course, is important in itself, in the context of ensuring that control is effective and not merely nominal”



CCW Meeting of Experts on Lethal Autonomous Weapons Systems Geneva (13-16 May). Statement by Ambassador Vinicio Mati. Permanent Representative of Italy to the Conference on Disarmament

“…We are convinced that the CCW has the merit not only of addressing the humanitarian concerns posed by existing weapons but also of preventing the development of new types of weapons that would be unacceptable under basic International Humanitarian Law principles. Therefore, we deem it very important that, within this framework, we will be able to address new potential threats appearing on the horizon…”



Delegation of Japan to the Conference on Disarmament. Statement by H.E. Ambassador Toshio SANO, Permanent Representative of Japan to the Conference on Disarmament. Experts Meeting on Lethal Autonomous Weapons Systems of the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects. 13 May 2014, Geneva

“…as we deal with the emerging technology of LAWS, we are facing the challenge of defining LAWS. Although an agreement on the definition at this informal meeting is not likely, we believe it is imperative to develop a common understanding about what we perceive as LAWS in order to advance discussions.

In this regard, I would like to point out that if we consider LAWS as “fully” lethal autonomous weapons systems, which, once activated, can effectively select and engage a target without human intervention, we believe, at this stage, it is questionable whether such autonomous weapons could comply with international humanitarian law, and this should therefore be highlighted in our discussion. Also, while we may continue researching and developing non-lethal autonomous technology for defense purposes, we are not convinced of the need to develop “fully” lethal autonomous weapon systems that are completely outside the control of human intervention”




“…We therefore agree on the need to observe the development of new weapons technologies in the context of respect for the human right to life, and we express our concern at the emergence of lethal autonomous weapons systems that would have the power to decide arbitrarily over the life or death of human beings. We reaffirm that States have the obligation to protect and defend the human right to life, and that this obligation cannot be delegated under any circumstances.

Furthermore, we recognize the applicability, in this matter, of the limits and preventive obligations provided for in international humanitarian law…”


New Zealand

Informal Meeting of Experts on Lethal Autonomous Weapons Systems. Statement by Joseph Ballard. Deputy Permanent Representative to the Conference on Disarmament 13 May 2014

“…Developments in the field of artificial intelligence hold considerable potential for improving our lives. But it is clear that, very importantly, they also offer the prospect of affecting the conduct of armed conflict, and in doing so can present a number of benefits and risks. New Zealand, like many other High Contracting Parties, is beginning to consider the implications of these developments. Much will depend on how, collectively, we frame our discussions, on the definitions we use, and on the appropriate engagement of all relevant actors…”



CCW Meeting of Experts on Lethal Autonomous Weapons Systems 13-16 May 2014

“…Before launching an attack, a military commander is also required to make a proportionality assessment, weighing the incidental harm which the attack may be expected to cause against the military advantage anticipated. Launching an attack which may be expected to cause excessive incidental loss of civilian life or damage to civilian objects is prohibited and must be halted or cancelled. An important question is therefore whether a fully autonomous weapon could be programmed to make such a complicated analysis and judgement without human intervention…”


Republic of Korea

“…I would also like to emphasize that the Republic of Korea continues to pay due attention to ethical considerations surrounding the use of robot systems for civil and military purposes. Korea is working to enact an ethics charter on the commercial service of robotic technology in accordance with the National Plan on Intelligent Robots. I expect the charter to contain provisions on ethical values and the code of conduct regarding the development, manufacture and use of robots…”


South Africa

Statement by South Africa at the CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), Geneva, Switzerland, 2014.

“…The development of LAWS poses serious questions and there are many issues on which clarity is required. This includes definitional certainty as to the notion of autonomous and semi-autonomous weapons systems. While fully autonomous weapon systems are not being used currently, the future use of such weapon systems raises a whole range of issues to be considered. Of primary concern to my delegation are the humanitarian implications of their use and related ethical considerations. One of the key questions in this regard that should be of concern to all of us is whether these new technologies of warfare would be compliant with the rules of International Humanitarian Law, including those of distinction, proportionality and military necessity, as well as their potential impact on human rights…”



Statement by the Spanish Delegation. Meeting of Experts on Lethal Autonomous Weapons Systems, Geneva, 13 May 2014

“…We therefore believe that any future regulation must inevitably pass through a prior phase of reflection and definition, which, in the case of emerging technologies, entails particular difficulty. For the same reason, we would regard as premature any proposal for a moratorium made before first defining the criteria that characterize this type of system, which raise no small number of questions…”



Remarks by Sweden at the Expert Meeting on LAWS in the CCW on 13-05-2014 (General Debate)

“…while it is true that many systems with various degrees of automation are being used or developed by states, it is not clear to us that this entails a move toward systems that would give full combat autonomy to machines…

A difficult issue is the threshold at which a weapon should be considered ‘autonomous’. Machine automation/autonomy exists on a continuum. An autonomous weapon implies one that is fully outside the control of a human. We very much doubt that this is a desirable development for military forces. As a starting point, Sweden believes that when it comes to decisions on the use of force against persons, humans should never be “out of the loop…”



“…First, a better understanding of the technological developments linked to these new systems is necessary…. Concepts such as autonomy, and the extent of the spectrum between automation and autonomy, must be clarified. It would be useful to draw on concrete examples in order to determine the desirable, legal and acceptable uses of systems with autonomous functions, and to identify precisely which aspects of autonomous weapons raise concerns. A central question must be asked, namely the capacity of machines to fully understand the environment in which they operate, to assess risks, or to carry out the qualitative evaluations required by International Humanitarian Law (IHL). Other equally critical questions will have to be addressed, such as the civilian use of these technologies and their potential dual use.

Second, we must take into account the ethical dimension of the militarization of increasingly autonomous technologies. It seems evident that the potential development and use of lethal autonomous weapons systems, capable of selecting and attacking targets without effective (or “meaningful”) human control, raise important ethical concerns.

Third, the question must be approached at the military and operational level. We must identify the origin of military interest in these technologies, the scale of the advantages expected from them, and the risks that flow from them…”


United Kingdom

“…The subject of autonomous weapons systems is a complex one, and it is very useful for all concerned to have the opportunity here to deepen understanding of the key themes. The agenda which France has prepared certainly includes consideration of the most relevant issues…”


United States

“…First, this important discussion is just beginning and we believe considerable work still needs to be done to establish a common baseline of understanding among states. Too often, the phrase “lethal autonomous weapons systems” appears still to evoke the idea of a humanoid machine independently selecting targets for engagement and operating in a dynamic and complex urban environment.

Second, it follows from the fact that we are indeed at such an early stage of our discussions that the United States believes it is premature to determine where these discussions might or should lead. In our view, it is enough for now for us collectively to acknowledge the value in discussing lethal autonomous weapons systems in the CCW, a forum focused on international humanitarian law, which is the relevant framework for this discussion.

Third, we must bear in mind the complex and multifaceted nature of this issue…”


Margaret Atwood has much to say about killer robots

On people’s unease with automatons, writer Margaret Atwood says, ‘There is nothing more uncanny than something that is almost human.’ (Chris Young/THE CANADIAN PRESS)

Margaret Atwood has a foolproof plan to stop our increasingly intelligent and powerful machines from rising up and taking over control of the planet: Make sure any robots we build have an easy-access “off” switch.

It’s just the kind of wry, razor-like directness one expects from one of Canada’s premier writers – who has seen science-fiction ideas explored in her novels leap off the page and into real life.

Atwood shared her advice on short-circuiting the cybernetic threat with the kind of people who might actually build subservient robots: The Conference on Human Factors in Computing Systems (CHI 2014) was in Toronto this week and Atwood’s keynote address, “Robotics in My Work and Life,” was eagerly absorbed by 2,900 attendees from 47 countries. These computer scientists, engineers and software developers are focused on everything from interactive displays, modelling computer vision and 3-D interfaces to robot design.

It may seem like futuristic stuff, but according to Atwood, science, technology, discovery and invention are part of a feedback loop that feeds culture, fiction and literature. “I get my ideas from things that people are already doing, but may not have perfected yet. And sometimes I put them in books and then a couple of years later, lo and behold they’ve done it,” she says. Novels such as Oryx and Crake delved into potential social consequences of technologies that CHI 2014 attendees are working to perfect: haptic feedback, neural interfaces and robotics. Even the really far-out biotech nightmares she envisioned are leaping off the page and into laboratories. “I hear about this from my Twitter followers … any time there’s a lab-meat event or a ChickieNobs type of thing, or the kidneys in the pigs, they’ve now overcome the obstacles to that.”

Atwood also has real currency in cybernetic circles thanks to her invention of LongPen, a telematic remote-writing robot originally designed to let authors “attend” book signings from the comfort of their own home. The company formed to develop LongPen, Syngrafii Inc., has now moved to apply its remote-signature tech to banking, security and business applications.

The speech was a fascinating journey through humanity’s conflicted history with technology. Atwood speaks softly, and the packed Exhibit Hall G of the Metro Toronto Convention Centre barely made a sound, except for murmurs of chuckling at her dry jokes. In her view, the horror stories about the machines overthrowing their fleshy masters come from a deep pool of myth and folklore expressing our unease with the self-automated humanoid, things that are possibly alive or not alive. “There is nothing more uncanny than something that is almost human,” she says.

“All our stories about robotics are stories like that,” says Atwood in an interview after the keynote. “It’s what we have always worried about, it’s the sorcerer’s apprentice story: He learns how to do the charm, he doesn’t know how to turn it off. It’s the Golem story: You make the Golem, you activate it, it’s supposed to do your work for you and then it runs amok.” On a more mundane level, she says we fear robots because we can’t yet answer questions like: “Will they take over my job? Will they take over my thinking? Will they take on a life of their own?”

The flip side of the robot fantasy is that when men aren’t dreaming of being destroyed by robots, they mostly imagine having sex with them. “What men want to make is a woman who won’t laugh at them or reject them,” says Atwood. Again, the roots run deep: the Greek myth of Pygmalion and his Galatea, the perfect female statue the sculptor falls in love with (after becoming disgusted by flawed, human women) and whom the gods animate; the 19th-century comic ballet Coppélia, about a wind-up woman who could only come to life at the cost of her infatuated suitor’s life; and of course, the feminist-nightmare Stepford wives. Evil robot women are such a strong trope in 20th-century fiction that they are often the subject of parody (who could forget Austin Powers’s FemBots?).

One can’t help noticing that most of our fictional versions of robot-powered futures are dystopian nightmares. Atwood contends we might have a brighter view of the future had our civilization not experienced the trauma of “all the crappy stuff we did in the 20th century.” Meaning the world wars, the genocides and the horrifically lethal atomic and chemical technologies we invented along the way.

“In the 19th century it was all utopias, wall-to-wall utopia [stories], huge numbers of them … if you go back and start digging around, you find so many of them. It was also a century in which utopian communities were founded in large numbers, some of them in Canada by Finnish people on the West Coast.”

It started to shift around the turn of the century, tipping irrevocably after the First World War. “The carnage was unbelievable. It was very hard to imagine a utopia after that … harder to imagine the perfectibility of human society.”

But there is also a hopeful interpretation of these fictional forecasts of doom via robotic overthrow or biological collapse: “We imagine it so we don’t make it.”

Let’s hope the robotic-interface designers of CHI 2014 were listening.

Source: Globe and Mail

The FAA Knew a Commercial Drone Pilot Broke No Laws, Fined Him $10K Anyway



Internal Federal Aviation Administration documents show that the result of an agency investigation determined commercial drone pilot Raphael Pirker violated no regulations—then the agency fined him $10,000 anyway.

Whether there’s actually a regulation making commercial drones illegal is the question that has been debated for months in a landmark, precedent-setting court case that has, at least temporarily, opened the skies for commercial drone operators. Whether the FAA actually believes it can fine someone simply for earning money while flying a drone (and not for, say, violating some other regulation as well) is a question the agency has refrained from answering during the entire proceedings.

Now we know that, internally, the agency knows there is no regulation banning commercial drones.

The “Enforcement Investigative Report,” obtained by Motherboard through Pirker’s lawyer, Brendan Schulman, was unable to list a single regulatory violation related to Pirker’s October 17, 2011 flight at the University of Virginia. Regardless, the FAA fined Pirker $10,000 for the “reckless flight of an aircraft,” a charge that a federal judge has since thrown out. The FAA is appealing.

“On October 17, 2011, Mr. Raphael Pirker conducted a number of commercial, Unmanned Aircraft System (UAS) flights around the University of Virginia (UVA) campus for the purpose of making a video of the campus and the new hospital wing contrary to the following 14 Code of Federal Regulations (CFR):,” the report reads.

And then, nothing. A huge blank space.

The report goes on to list the policy statement that Pirker violated, a voluntary guidelines document issued by the FAA in 2007 that is not legally binding.

Despite the investigation’s finding, the FAA tried to fine him using regulations written for manned aircraft, in a move that “represents a moment unprecedented in American aviation history” and an attempt to broaden the definition of the word “aircraft” to the point where it is meaningless, according to Schulman, writing in a legal brief filed Monday.

Over and over again, the FAA has said in the media that Pirker wasn’t fined for flying commercially, yet over and over in its legal briefs, the agency has said that commercial operation of a drone is banned (according to the policy statement, not any regulation). In its internal evaluation, the agency again repeatedly notes the commercial nature of his flight.

This latest brief is likely to be the last before the National Transportation Safety Board eventually issues a ruling—a decision that could come any week now.

“The FAA [has] engaged in a campaign of intimidation against companies and individuals who were using model aircraft for business purposes, issuing cease and desist letters in an attempt to enforce [its] policy as if it were a binding regulation,” Schulman wrote.

That alleged intimidation campaign has continued, with the FAA recently attempting to fine a drone hobbyist—not a commercial pilot—for the first time ever. But since the original decision in March by Judge Patrick Geraghty, the FAA has had its hands full with drone pilots emboldened to disobey the FAA’s official stance that commercial use of drones is illegal. Companies and nonprofit groups are openly disobeying the administration’s guidance, and are asking the agency to show where, in the regulations, it says they cannot fly a drone legally.

It doesn’t say that anywhere, which is why the FAA has tried to argue that Pirker’s five-pound styrofoam drone is an “aircraft,” a definition that makes no sense considering that, in official documents, the agency has always distinguished between model aircraft and manned aircraft.

Schulman argues that if the NTSB overturns Geraghty’s decision based on the FAA’s new interpretation of the word “aircraft,” it would open up a huge can of regulatory worms.

“The FAA’s newfound concern for the harm that could be inflicted upon a university statue or railroad tracks by a 5-pound piece of styrofoam is not credible,” he wrote. “The [FAA] completely fails to address how to reconcile [its] proposed interpretation [of regulation] with the fundamental contradictions in the regulatory scheme that it creates.”

Among those contradictions: The FAA says that the minimum safe altitude for an aircraft is 500 feet. At the same time, its official documents suggest that the maximum safe altitude for a model aircraft is 400 feet. If both become “aircraft,” they cannot be reconciled.

Other regulations that apply to “aircraft” suggest that it’s a violation to fly one without briefing passengers on how to buckle a seatbelt, and that it is illegal to aim a laser pointer at an aircraft—something that’s certainly a problem if there’s a pilot onboard, but completely harmless if it’s a styrofoam drone.

Schulman wrote that, “having been caught trying to enforce the unenforceable, the FAA resorts to an absurd post hoc interpretation of the definition of ‘aircraft.’ All of these strained efforts are undertaken for a single purpose: to obscure the agency’s decade-long delay in issuing proposed unmanned aircraft regulations pursuant to the required notice and comment process.”

The agency’s own internal document suggests that’s absolutely the case.

Pirker Investigation


Source: Motherboard, 5/13/2014