In a letter to the organisation, artificial intelligence (AI) leaders, including billionaire Elon Musk, warn of "a third revolution in warfare".
The letter says "lethal autonomous" technology is a "Pandora's box", adding that time is of the essence.
The 116 experts are calling for a ban on the use of AI in managing weaponry.
"Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend," the letter says.
"These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways," it adds.
There is an urgent tone to the message from the technology leaders, who warn that "we do not have long to act".
"Once this Pandora's box is opened, it will be hard to close."
Experts are calling for what they describe as "morally wrong" technology to be added to the list of weapons banned under the UN Convention on Certain Conventional Weapons (CCW).
Along with Tesla co-founder and chief executive Mr Musk, the technology leaders include Mustafa Suleyman, co-founder of Google's DeepMind.
A UN group focusing on autonomous weaponry was scheduled to meet on Monday but the meeting has been postponed until November, according to the group's website.
A potential ban on the development of "killer robot" technology has previously been discussed by UN committees.
In 2015, more than 1,000 tech experts, scientists and researchers wrote a letter warning about the dangers of autonomous weaponry.
Among the signatories of the 2015 letter were scientist Stephen Hawking, Apple co-founder Steve Wozniak and Mr Musk.
What is a 'killer robot'?
A killer robot is a fully autonomous weapon that can select and engage targets without human intervention. No such weapons currently exist, but advances in technology are bringing them closer to reality.
Those in favour of killer robots argue that the existing laws of war may be sufficient to address any problems that emerge if the weapons are ever deployed, and that a moratorium, rather than an outright ban, should be imposed if that proves not to be the case.
However, those who oppose their use believe they are a threat to humanity and any autonomous "kill functions" should be banned.