Accepted Papers
Advanced Drone Attack Detection Using 5G Open RAN Platform

Mohamed Jacem Guezguez1 and Olfa Besbes2, 1Cogicom, Paris, France, 2University of Monastir, Monastir, Tunisia

ABSTRACT

The fifth-generation (5G) network is the latest evolution in mobile communication technology, offering significant advancements over its predecessors, 4G (LTE) and 3G, including faster speeds, lower latency, and a wealth of new capabilities. In parallel, unmanned aerial vehicles (UAVs), commonly referred to as drones, are becoming increasingly popular and ubiquitous. Integrating drones with 5G networks unlocks new applications that harness the high-speed, low-latency, and extensive connectivity of 5G technology. However, the misuse of drones poses risks ranging from privacy invasion to safety hazards. In response to these challenges, this research paper presents an innovative 5G Open RAN platform featuring programmable software deployed on 5G gNodeBs, enabling the collection and monitoring of radio-sensitive events related to drone intrusion attacks. A radio-based detection technique is also proposed to identify threats and block unauthorized drones, thus safeguarding private infrastructure. To illustrate the platform's effectiveness, a case study demonstrates its capabilities against drone intrusion attacks at an airport.

KEYWORDS

Mobile Network, Drone Attacks, 5G Networks, Beamforming, Network Slicing.
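
The abstract describes the detection pipeline only at a high level. As a purely illustrative sketch, the Python snippet below shows one way threshold-based screening of per-UE radio measurements could look; the event fields (RSRP, Doppler shift, elevation angle) and the thresholds are assumptions for demonstration, not the paper's actual Open RAN event schema.

```python
# Hypothetical sketch: flagging drone-like radio signatures from gNodeB
# measurement reports. Field names and thresholds are illustrative; the
# paper's actual Open RAN event schema is not given in the abstract.
from dataclasses import dataclass

@dataclass
class RadioEvent:
    ue_id: str            # identifier of the connected device (UE)
    rsrp_dbm: float       # reference signal received power
    doppler_hz: float     # estimated Doppler shift of the uplink
    elevation_deg: float  # estimated angle of arrival above the horizon

def looks_like_drone(event: RadioEvent) -> bool:
    """Crude heuristic: terrestrial UEs rarely combine a high elevation
    angle with a large Doppler shift, while an airborne UE often does."""
    return event.elevation_deg > 30.0 and abs(event.doppler_hz) > 200.0

def flag_intrusions(events: list[RadioEvent]) -> list[str]:
    """Return the IDs of UEs whose radio signature suggests a drone."""
    return [e.ue_id for e in events if looks_like_drone(e)]

if __name__ == "__main__":
    events = [
        RadioEvent("ue-001", -95.0, 20.0, 2.0),    # ground-level user
        RadioEvent("ue-002", -80.0, 350.0, 45.0),  # drone-like signature
    ]
    print(flag_intrusions(events))  # ['ue-002']
```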


How Computationally Powerful Are Transformer Language Models?

Jesse Roberts, Computer Science Department, Vanderbilt University, Nashville, TN

ABSTRACT

In this article, we prove that the general transformer model undergirding modern large language models (LLMs) is Turing complete under reasonable assumptions. This is the first work to directly address the Turing completeness of the underlying technology employed in GPT-x, as past work has focused on the more expressive, full auto-encoder transformer architecture. From this theoretical analysis, we show that the sparsity/compressibility of the word embedding is an important consideration for Turing completeness to hold. From our results, we categorize transformers as variants of the B machines studied by Hao Wang.

KEYWORDS

Large Language Models, Transformers, Decoder-Only, Machine Learning, Transformer Theory.
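
For readers unfamiliar with the architectural distinction the abstract draws, the sketch below implements the causal (decoder-only) self-attention at the core of GPT-style models: each position attends only to itself and earlier positions, unlike a full encoder, which attends bidirectionally. This is a pedagogical sketch with arbitrary dimensions, not the paper's formal construction.

```python
# Illustrative causal self-attention, the core operation of the
# decoder-only transformer the paper analyzes. The identity Q/K/V
# projections and dimensions are simplifications for readability.
import numpy as np

def causal_self_attention(x: np.ndarray) -> np.ndarray:
    """x: (seq_len, d_model). Position i may attend only to positions
    0..i, which is what makes the model autoregressive."""
    seq_len, d = x.shape
    scores = x @ x.T / np.sqrt(d)                   # token similarities
    mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores[mask] = -np.inf                          # hide future positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ x                              # mix visible tokens

if __name__ == "__main__":
    x = np.random.default_rng(0).standard_normal((4, 8))
    print(causal_self_attention(x).shape)  # (4, 8); row i uses rows 0..i only
```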


Mathematical Modelling of Curbing Bribery and Corruption Among Policemen in Nigeria

Sharhabil Tasiu A1, Oshokunuozaname Dania1, and Jafar Anafi1, 1Department of Public Health, Helpman Development Institute, Abuja, Nigeria, 2Department of Public Policy, Helpman Development Institute, Abuja, Nigeria

ABSTRACT

This study presents a compartmental model capturing the dynamics of bribery and corruption among policemen in Nigeria, categorizing the population into distinct groups and analyzing transitions between them. The basic reproduction number R0 is calculated and found to be less than 1, which indicates that bribery will be eliminated from the population. Berkeley Madonna is used to conduct numerical simulations of the bribery dynamics, representing the interactions among the groups through a set of differential equations. This approach visually illustrates how the intervention influences the dynamics of bribery.
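
The abstract does not spell out the compartments or equations. As a hedged illustration of the approach, the snippet below integrates a generic three-compartment model (honest, bribe-taking, reformed officers) with SciPy in place of Berkeley Madonna; the compartment structure, parameter names, and values are assumptions chosen so that R0 < 1, mirroring the qualitative conclusion reported above, and do not reproduce the paper's actual model.

```python
# Illustrative compartmental model of bribery dynamics, integrated with
# SciPy rather than Berkeley Madonna. Compartments and parameters are
# assumed for demonstration and do not reproduce the paper's equations.
import numpy as np
from scipy.integrate import solve_ivp

beta = 0.2   # rate at which honest officers are drawn into bribery
gamma = 0.5  # rate at which bribe-takers reform (e.g. via intervention)

def dynamics(t, y):
    s, b, r = y  # honest, bribe-taking, reformed fractions of the force
    ds = -beta * s * b
    db = beta * s * b - gamma * b
    dr = gamma * b
    return [ds, db, dr]

# For this toy model, R0 = beta / gamma = 0.4 < 1, so bribery dies out.
print("R0 =", beta / gamma)

sol = solve_ivp(dynamics, (0.0, 100.0), [0.9, 0.1, 0.0],
                t_eval=np.linspace(0.0, 100.0, 6))
print(np.round(sol.y[1], 4))  # bribe-taking fraction decays toward zero
```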


Reach Us

ccitt@ccitt2024.org
ccittconf@yahoo.com
