The Valkyrie Drone Weapons System - Using ORBAI Combat AI
Victory on the battlefield of 2030 will not be decided by the size, speed, or firepower of weapons systems; it will be decided by their intelligence and precision. As the threat has evolved in the 21st century from the militaries of large nation-states to smaller guerrilla and insurgent groups embedded within local populations and infrastructure, the nature of the mission for the armed forces has changed. Large, fixed military targets and troop concentrations rarely present themselves, so conventional strike and bomber aircraft cannot be used to full effect. Losing these valuable assets in a theater where they have limited utility and constant exposure to portable AA weapons is too costly.
Smaller, more precise airborne weapons must be deployed in large numbers to bring the same firepower to the target, but there are often not enough human pilots to fly them, let alone to fly them together in precisely coordinated attacks. Today's semi-autonomous drones are not intelligent enough to carry out these attacks on their own.
Enter the ORBAI Valkyrie weapons system: 'swarms' of a dozen medium-sized quadcopter drones, all carried into the theater on one semi-autonomous (but human-directed), long-range carrier aircraft. Each quadcopter drone is based on FPV racing drone technology; once it detaches from its carrier, it can pull 20 g, reach speeds of 100 mph, carry a 1/2 lb payload, and has a 5+ mile range for both its battery and telemetry. The drones are so fast and maneuverable that no current AA technology or human-fired weapon could touch them.
The Valkyrie drones can be equipped with shaped high-explosive, incendiary, fragmentation, reconnaissance, or other payloads, making them as deadly as a guided missile. Unlike a missile, however, they are smart far beyond today's guided and semi-autonomous weapons: they can maneuver into extremely occluded spaces to reach their target and precisely strike its most vulnerable spots with the most devastating payload, crippling a facility, insurgent cell, or vehicle caravan.
Each Valkyrie drone streams video back to the carrier over a low-latency broadcast link, along with telemetry from its systems, GPS, and gyros. The carrier hosts all the AI neural networks that command and control the drones via the radio control link (in addition to the drones' limited onboard autonomy with LOS).
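As a rough sketch of the kind of per-drone downlink frame just described (the actual link format is not specified here, so every field name below is an illustrative assumption), one could model a telemetry packet as:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class DroneTelemetry:
    """One downlink frame from a drone to the carrier (all fields illustrative)."""
    drone_id: int
    gps: tuple          # (lat, lon, alt_m) from the onboard GPS
    gyro: tuple         # (roll, pitch, yaw) in degrees from the gyros
    battery_pct: float  # remaining battery, 0-100
    link_rssi_dbm: int  # telemetry link signal strength

    def encode(self) -> bytes:
        """Serialize for the low-latency radio link."""
        return json.dumps(asdict(self)).encode()

    @classmethod
    def decode(cls, frame: bytes) -> "DroneTelemetry":
        """Reconstruct a frame on the carrier side."""
        d = json.loads(frame)
        return cls(drone_id=d["drone_id"], gps=tuple(d["gps"]),
                   gyro=tuple(d["gyro"]), battery_pct=d["battery_pct"],
                   link_rssi_dbm=d["link_rssi_dbm"])
```

JSON is used here only for readability; a real low-latency link would use a compact binary encoding.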
Onboard the carrier drone is an ORBAI Combat AI, based on our spiking neural networks, with vision processing, audio processing, analysis, decision, and control subsystems. The vision system can see and recognize specific people and objects in the theater, learn to identify new ones, and share this knowledge with its peers. The audio system allows voice control of the AI by the carrier pilot, giving the swarm joint commands and the drones individual commands, such as attacking specific targets. The analysis, decision, and control subsystems make these drones completely autonomous once tasked, and allow them to coordinate, plan, and time an attack against fixed targets, vehicles, or specific individuals. A Valkyrie is selective and specific, choosing who and what to remove from the theater of operations and take to Valhalla.
This analysis, decision, and control AI is trained and evolved over thousands of simulated missions, starting from a small AI core with basic insect intelligence and progressively evolving into a formidable, intelligent Combat AI using the ORBAI patented NeuroCAD process. Each generation of drones feeds its experience into the collective, cumulative training set that the next generation trains on and evolves from. The drones learn from each other and learn to work together, coordinating their tactics and attacks, so that once dispatched by the carrier pilot, they can be trusted to do their jobs efficiently and intelligently.
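The generational loop described above can be sketched in outline. The skeleton below is a generic genetic algorithm, not the patented NeuroCAD process itself; `simulate_mission` and `mutate` are hypothetical placeholders for the mission simulator and the genome-variation operator:

```python
import random

def evolve_combat_ai(population, simulate_mission, mutate,
                     n_generations=100, missions_per_gen=32, elite_frac=0.2):
    """Sketch of the generational training loop: each genome flies simulated
    missions, the best performers seed the next generation, and every mission
    result accumulates into a shared training set. Names are illustrative."""
    training_set = []  # cumulative experience shared across generations
    for gen in range(n_generations):
        scored = []
        for genome in population:
            # Fitness = mean score over several simulated missions
            results = [simulate_mission(genome) for _ in range(missions_per_gen)]
            fitness = sum(r["score"] for r in results) / missions_per_gen
            training_set.extend(results)          # pool experience collectively
            scored.append((fitness, genome))
        scored.sort(key=lambda fg: fg[0], reverse=True)
        elites = [g for _, g in scored[:max(1, int(elite_frac * len(scored)))]]
        # Next generation: keep the elites, fill the rest with mutated copies
        population = elites + [mutate(random.choice(elites))
                               for _ in range(len(population) - len(elites))]
    return population, training_set
```

In a toy run, a genome could be a small parameter vector and `simulate_mission` a scoring function; the same skeleton scales to any genome representation that `mutate` can vary.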
Using ORBAI’s Bidirectional Interleaved Complementary Hierarchical Neural Networks (BICHNNs), constructed by our NeuroCAD toolchain with compact genome-to-full-connectome expansion, we can efficiently run genetic algorithms to specialize these networks into optimal visual, speech, sensory, and even motion-control cortices. Another novel behavior of these networks is that, when properly set up and trained, they hold internal state and continue to operate even when all inputs are turned off. This means they have memory and logic, and can be evolved to do cognition or planning, giving us a frontal cortex capable of complex decision making. In this manner, we can evolve most of the components we need to make an actual functional brain.
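A minimal illustration of the genome-to-connectome idea: a handful of wiring-rule parameters deterministically expands into a full weight matrix, so evolution searches the tiny genome space instead of the huge weight space. The expansion scheme below is an invented stand-in, not NeuroCAD's actual encoding; the `step` function also shows how such a recurrent network can keep a nonzero internal state with its inputs turned off:

```python
import numpy as np

def expand_genome(genome, n_neurons=256):
    """Expand a compact genome into a full connectome (weight matrix).
    The genome is just (seed, density, weight_scale, inhibitory_fraction);
    the same genome always yields the same connectome, so a genetic
    algorithm can search 4 numbers instead of 256x256 weights.
    This scheme is an illustrative stand-in, not NeuroCAD's encoding."""
    seed, density, weight_scale, inh_frac = genome
    rng = np.random.default_rng(int(seed))
    mask = rng.random((n_neurons, n_neurons)) < density           # sparse wiring
    signs = np.where(rng.random(n_neurons) < inh_frac, -1.0, 1.0)  # inhibitory columns
    weights = rng.random((n_neurons, n_neurons)) * weight_scale * signs
    return weights * mask

def step(state, weights, inputs=0.0):
    """One recurrent update; with inputs=0.0 the network runs on internal state."""
    return np.tanh(weights @ state + inputs)
```

With a sufficiently strong recurrent weight scale, activity driven once from outside keeps circulating after inputs are removed, which is the memory-like behavior the text describes.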
With that, we can assemble the components into an artificial brain that is functionally much like a human brain, but with a macrostructure architected and evolved to perform best with the SNN technology, tools, and processes used. We use artificial evolution to choose what to keep, what to substitute, and what to eliminate, building an artificial brain that optimally performs the tasks we set it. The result is an optimal Combat AI, capable of piloting or driving autonomous vehicles on land, at sea, and in the air: integrating sensor information with real-time intel, planning the mission, autonomously piloting the vehicle throughout it, solving problems, overcoming obstacles, and deploying the reconnaissance and/or weapons necessary to achieve the mission.
When we add traditional computational, information-access, and deep learning capabilities for a specific combat role, we will have combat-capable narrow AIs, adept at a sufficient variety of localized tasks to function as superhuman AI combatants in a specific role, operating a specific platform and its weapons. This is an important intermediate step toward a combat-capable AI with human-level or superhuman general intelligence. These narrow AIs also have obvious immediate uses beyond piloting drones: driving land robots, autonomous tanks, and vehicles; gathering surveillance data; assessing the situation; choosing targets; and operating their weapons systems completely autonomously, far from human control. As they do so, they gather enormous amounts of data, which are pooled and used to train and evolve the next generation of AIs. This can be done on a daily update cycle: each generalized Combat AI learns from all the AIs' previous day's experience, with a new AI evolved overnight, getting smarter and adapting to the opposing force's tactics daily.
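The daily update cycle reads, in outline, like the following sketch. All the callables here are hypothetical placeholders for the log-collection, overnight-evolution, and deployment steps; only the cycle's shape comes from the text:

```python
def daily_update_cycle(fleet, collect_logs, evolve, deploy):
    """One iteration of the overnight cycle: pool the day's mission data
    from every platform, evolve a new AI generation against it, and push
    the result back to the whole fleet. All callables are illustrative."""
    day_logs = []
    for unit in fleet:
        day_logs.extend(collect_logs(unit))   # each unit's encounters today
    new_ai = evolve(day_logs)                 # overnight training/evolution
    for unit in fleet:
        deploy(unit, new_ai)                  # fleet wakes up with the new AI
    return new_ai
```

Because every platform contributes its logs and every platform receives the new generation, a tactic encountered by one unit is, in principle, known to all units the next day.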
This has commercial application in civilian drones for mining, forestry, and other natural-resource endeavors, and for agriculture in farming and ranching. The ultimate civilian application is autonomous vehicles: self-driving cars that are fully Level 5 autonomous and can drive without human input. This is still not possible with the commercial systems in the field today, and will require an adaptive, evolving AI that can learn in the field, as the ORBAI AI does. This autonomous driving system alone will have a USAF pedigree, having driven and flown in all terrains, in all weather, under combat conditions, and will be the gold standard in self-driving.