Program
Day 1 - November 13, 2024
9:30 AM - 11:00 AM
Opening Session
Welcome & Keynote
WELCOME
KEYNOTE: On the Journey to 6G. Where Are We Today and What Can We Expect?
The next generation of mobile communication, 6G, will play a significant role in shaping the future of wireless. Research activities focused on 6G began gaining traction in 2019 and have since reached full momentum. This keynote will highlight key 6G technology components from a test & measurement perspective. Furthermore, an outlook will be provided on potential drivers and applications for 6G.
10:30 AM - 11:00 AM
Coffee Break & Networking
11:00 AM - 2:00 PM
Session 1
Mobile Communications
Why MIMO requires FinFET and what challenges an RF system engineer will need to overcome
In this presentation I outline why FinFET design is essential for delivering MIMO systems, showing that while drawbacks do exist they can be overcome, explaining the key design challenges, and providing rules to help analog engineers.
Background
5G terrestrial and satellite communications have already begun to use bands in the FR2 range (24–52 GHz). The range of wireless communications in these bands is limited by the short signal wavelength, so transceivers rely on the integration of MIMO systems to meet the link budget and mitigate interference. For the ASIC designer, this means more channels, more radios, and more digital signal processing.
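As a back-of-the-envelope illustration of this link-budget argument (the frequencies, distance, and array size below are assumptions for illustration, not figures from the talk):

```python
import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss (Friis) in dB."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / 3e8)

# Moving from 3.5 GHz (FR1) to 28 GHz (FR2) at 100 m adds ~18 dB of path loss...
extra_loss = fspl_db(28e9, 100) - fspl_db(3.5e9, 100)
# ...which is roughly what a 64-element MIMO array recovers in array gain.
array_gain = 10 * math.log10(64)
print(f"extra loss: {extra_loss:.1f} dB, 64-element array gain: {array_gain:.1f} dB")
```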
Power consumption
FinFET is the ideal technology for large MIMO systems; by enabling ‘digital radio’ solutions based on RF-ADCs/DACs, it gives the flexibility to design ASICs that support multiple standards. There is, however, a drawback in implementing this architecture: power consumption. Energy per conversion increases exponentially when operating above 1 GHz; for example, moving the sampling frequency from 1 GHz to 5 GHz increases the converter power by 20x.
Significant power can be saved by designing an RF analog front end (AFE) that converts the multi-GHz RF signal to a lower intermediate frequency (IF) below 500 MHz before the transition to the digital domain, thereby relaxing the data converters' sampling frequency.
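A minimal numeric sketch of this trade-off, assuming a Walden-style converter figure of merit with a speed penalty above 1 GHz (the penalty exponent is chosen only to reproduce the ~20x figure quoted above; it is not vendor data):

```python
def adc_power_w(fs_hz: float, enob: float = 10, fom_j: float = 50e-15) -> float:
    """Walden-style estimate P = FOM * 2**ENOB * fs, with an assumed
    speed penalty above 1 GHz to model the exponential energy increase."""
    penalty = (fs_hz / 1e9) ** 0.86 if fs_hz > 1e9 else 1.0
    return fom_j * 2 ** enob * fs_hz * penalty

p_rf = adc_power_w(5e9)  # direct RF sampling at 5 GS/s
p_if = adc_power_w(1e9)  # sampling a sub-500 MHz IF at 1 GS/s (Nyquist margin)
print(f"power ratio 5 GS/s vs 1 GS/s: {p_rf / p_if:.0f}x")  # ~20x
```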
Further benefits
Using FinFET for analog design brings multiple benefits: devices are compact, the high Gm and Rout are ideal for designing analog amplifiers, and RF designers can leverage the excellent high-frequency performance, with a peak fT of 600 GHz.
Again, there is a drawback to this approach: the number of masks required by the technology is two to three times higher than in planar geometries. The number of design rules increases accordingly, making the technology better suited to ‘machine-driven’ digital implementations. Attempts to use custom analog layout flows are bound to fail, especially when using the smaller (7 nm) nodes to achieve the highest system integration and lowest power consumption.
Rules to design MIMO in FinFET technology
In FinFET, a regular analog layout is paramount for success. Any deviation from this approach may give short-term advantages in the initial schematic simulations that disappear once parasitic resistances (due to metals and vias) are introduced. Simulations are slow, and engineers need to resort to approximations to control simulation time.
The implementation challenges are significant, but so too are the benefits. The following rules can help engineers handle the intricacies of FinFET analog design:
1. Do not mix devices having different dimensions
2. Use repeatable patterns
3. Estimate interconnect parasitics from the start
4. Use digital calibration to correct analog errors
5. Current density limits the transmitter output power
6. Corner frequency of flicker noise is high
7. Simulations are slow
Integration of Reflecting Intelligent Surfaces into software-defined networking environments
A Reflecting Intelligent Surface (RIS) is a programmable planar metal surface that can change the propagation of incident electromagnetic waves. It consists of a large number of inexpensive and passive reflective unit cells. These can be configured so that incident waves are reflected in a desired direction that is not the specular one.
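As a rough sketch of the underlying physics (a minimal example using the generalized law of reflection; the carrier frequency, angles, and 1-bit quantization are assumptions for illustration, not project parameters):

```python
import math

freq = 28e9                 # assumed FR2 carrier
lam = 3e8 / freq            # wavelength, ~10.7 mm
k = 2 * math.pi / lam       # wavenumber
dx = lam / 2                # assumed unit-cell pitch
theta_i = math.radians(0)   # incident angle (normal incidence)
theta_r = math.radians(30)  # desired non-specular reflection angle

# Generalized Snell's law: a linear phase gradient across the surface,
# dphi/dx = k * (sin(theta_r) - sin(theta_i)), steers the reflected beam.
phases = [(k * dx * n * (math.sin(theta_r) - math.sin(theta_i))) % (2 * math.pi)
          for n in range(8)]
# PIN-diode cells with 1-bit control would round each phase to 0 or pi.
print([round(p, 2) for p in phases])
```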
As part of the RIS4NGWB project funded by the German Federal Office for Information Security, the use of RIS technologies in the backhaul of a 5G RAN in the FR2 frequency band is to be investigated. The focus of the work is on making communication between multiple endpoints more reliable and hardening it against eavesdropping.
In the project, TU Clausthal took over the project management and the design of a new type of reflecting intelligent surface in the mm-wave range. In the design, PIN diodes serve as the basic element of a passive reflective unit cell. Many of these unit cells form the RIS. The Friedrich-Alexander University is the second partner working on investigations into physical layer security. Fraunhofer IIS has created a MATLAB simulation model for analyzing the transmission quality of RIS components and integrating them into a software-defined networking environment.
The congress paper will present how a Reflecting Intelligent Surface works and is designed. This is followed by a presentation of typical applications for the use of a RIS, such as closing dead spots in mobile radio coverage. The project itself focuses on two areas of application: protecting against eavesdropping on a "point-to-point" connection in the backhaul and maintaining a connection in the event of temporary time-varying or static interference in the transmission beam to increase resilience.
The main part of the congress paper deals with the integration of the new RIS into a software-defined networking environment. A network element, the beam switch, was defined for this purpose. It coordinates the alignment of the beams between different end points. The beam switch controls the alignment of the beam of the transmitting antenna, the alignment of the sensitivity of the receiving antenna, and the input or output angle of one or more reflecting intelligent surfaces. It is controlled by the open-source SDN controller OpenDaylight.
The beam switch uses the geographical coordinates of the end points and those of the RIS to calculate the orientation of the individual end points. These are used to define a flow. For this purpose, the coordinates are embedded in flow tables on the beam switch. The beam is switched when a flow is changed in the routing table of the beam switch. To do this, the switch calculates the angles to be set at the respective end points based on the entries in the routing table. The so-called Euler angles are used for this calculation.
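A minimal sketch of the kind of geometry such a network element has to solve (simplified to a local east-north-up frame and two rotation angles; the project itself works with geographic coordinates and full Euler angles):

```python
import math

def pointing_angles(src, dst):
    """Azimuth/elevation in degrees to point a beam from src towards dst,
    with positions given in a local east-north-up frame (metres)."""
    dx, dy, dz = (b - a for a, b in zip(src, dst))
    azimuth = math.degrees(math.atan2(dx, dy))                    # from north, clockwise
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # above horizon
    return azimuth, elevation

endpoint = (0.0, 0.0, 2.0)   # assumed transmitter position
ris = (40.0, 25.0, 10.0)     # assumed RIS position
print(pointing_angles(endpoint, ris))  # angles a flow-table entry would carry
```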
The project results are verified with a demonstrator. This uses the experimental FR2 platform of Fraunhofer IIS, which realizes the physical layer of a 5G connection.
Leveraging MATLAB for ISAC Research in 6G
The integration of sensing and communication (ISAC) is emerging as a cornerstone technology for 6G, promising unprecedented capabilities. This technical presentation focuses on leveraging MATLAB to explore and refine ISAC paradigms for 6G. We will discuss the design workflows in MATLAB that enable the modeling of complex ISAC scenarios, performance evaluation, and the development of novel algorithms from seamless integrated operation to effective coexistence. This presentation aims to equip researchers and practitioners with the insights and tools necessary to navigate the challenges of ISAC, leveraging MATLAB's robust capabilities to drive innovation in 6G research.
Advancements in Micro-acoustic Resonators
This session aims to provide a comprehensive overview of advancements in micro-acoustic resonators, primarily used in Radio Frequency (RF) filters and their role in the RF Front-End. Following an introduction to the fundamentals of RF filters, we will discuss different types of micro-acoustic filters and the materials and processes used for developing such devices. We will provide insights into filter design approaches and R&D hotspots for surface acoustic wave (SAW) and bulk acoustic wave (BAW) filters.
These insights will include a discussion of various design considerations including an exploration of distinct types of acoustic wave modes excited by different piezoelectric materials and their associated filter design trade-offs. The presentation will also explain how such design considerations accommodate the push towards higher frequencies and more complicated band combinations due to further developments in 5G and 6G communications. In the final section, we will highlight the packaging challenges surrounding micro-acoustic devices.
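For orientation, micro-acoustic resonators are commonly described by the Butterworth–Van Dyke equivalent circuit; the sketch below derives the resonance/antiresonance pair and effective coupling from assumed, purely illustrative element values (not a real filter design):

```python
import math

# Butterworth–Van Dyke model: motional Lm-Cm branch in parallel with C0.
Lm, Cm, C0 = 80e-9, 80e-15, 1.6e-12  # illustrative values, not a real design

fs = 1 / (2 * math.pi * math.sqrt(Lm * Cm))        # series resonance
fp = fs * math.sqrt(1 + Cm / C0)                   # parallel antiresonance
k2 = (math.pi ** 2 / 8) * (1 - (fs / fp) ** 2)     # effective coupling (approx.)
print(f"fs = {fs/1e9:.2f} GHz, fp = {fp/1e9:.2f} GHz, k2_eff ~ {k2*100:.1f}%")
```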
Presenter:
Christian Hoffmann (PhD)
Christian Hoffmann is currently employed by RF360 Europe GmbH (Qualcomm) as a Principal Engineer in the New Technology Business Development department in Munich. He received a diploma in Physics from Aachen University of Technology in Germany, as well as a doctoral degree in Electrical Engineering from the same university. Mr. Hoffmann has extensive experience in material science and RF engineering and has held a variety of positions within the RF360 organization and related entities over the course of 27 years including VP Corporate Material Research in Austria, Senior Chief Researcher in Tokyo, Japan, and as a member of the CTO Office in Munich, Germany. His focus is on RF materials, packaging, and RF engineering.
1:00 PM - 2:00 PM
Lunch & Networking
11:00 AM - 2:00 PM
Session 2
DECT NR+
DECT NR+ standardization progress and spectrum regulation activities
This presentation provides an update on ETSI TC DECT standardization activities on developing interoperability requirements, introducing Release 2 content and items under work for future releases. TC DECT has also completed a technical report introducing DECT-2020 NR technology and example use cases. This report is used in spectrum regulation work within Europe, and possibly also in other regions.
Radio administrations are working on local and shared spectrum operations in which DECT NR+ can be deployed. We provide an update on the technical requirements available and on ongoing activities.
NR+ is leaving the lab – first massive scale field tests in Smart Metering
Following the establishment of NR+ as a 5G SRIT, Wirepas released a generally available stack tailored to the needs of large-scale, utility-driven electricity metering roll-outs. Facing the two most complex challenges in IoT – scale and reliability – smart metering has been the leading use case in the segment of Industrial IoT. While many wired, wireless and sometimes weird technologies have been deployed, a globally available and affordable implementation was never developed. Wirepas, together with many partners, is embracing RF mesh as the default topology for the 1900 MHz band – a perfect match for integrating the many (expensive) lessons learned in connectivity over the last decades. The presentation will share performance numbers from the pilots and test networks established mainly in India, laying the foundation for a nationwide rollout of up to 500 million devices by the end of this decade. We will cover scale, range, density, coexistence and the path from a powerful connectivity standard to a widely accepted implementation.
DECT NR+ radio conformance testing
The NR+ radio conformance specification is under development. NR+ is a relatively new radio standard, the market is emerging, and turn-key solutions for conformance testing do not yet exist. This presentation explains the radio conformance requirements that have been defined for NR+ and how to test them.
Accelerating Innovation through Collaboration: A Glimpse into the Franco-German MERCI Project
Co-funded by both German and French ministries, the MERCI project exemplifies Franco-German cooperation in pioneering the development of novel solutions for private wireless networks. Integral to this initiative is the incorporation of cutting-edge DECT NR+ technology. Since November 2022, the dedicated project team has efficiently generated prototypes applicable to sectors such as Professional Audio, Professional Video, Industry 4.0, and IoT.
The forthcoming presentation offers an exciting opportunity to spotlight the progressive achievements and potential impact of employing DECT NR+. Through this platform, we aim to shed light on the development work carried out, the outcomes of comprehensive evaluations, and share details of selected prototypes which underscore the potent capabilities of DECT NR+.
1:00 PM - 2:00 PM
Lunch & Networking
2:00 PM - 4:00 PM
Session 3
Mobile Communications
3GPP trends towards 5G-Advanced and 6G technologies
The evolution of 3GPP (the 3rd Generation Partnership Project) standards has made tremendous progress with 5G. Initial 5G targeted three main use cases: enhanced Mobile Broadband (eMBB), ultra-reliable low-latency communication (URLLC), and massive IoT; new verticals were added in later releases. To further enhance 5G capabilities, 5G-Advanced builds on the 5G baseline defined by 3GPP in Releases 15, 16, and 17.
3GPP Release 18 is the starting point of 5G-Advanced. 5G-Advanced will enhance network performance and add support for new applications and use cases such as the metaverse, RedCap devices, and joint communication and sensing (JCAS) within the same system or network. The journey to enhance network performance with AI/ML, applying AI/ML to indoor positioning and to network energy saving, has just started and will be enhanced in Release 19 and beyond.
Several of the 5G-Advanced technology components can be seen as starting points for some of the 6G building blocks. For example, AI/ML will play an important role in the fully data-driven architecture of 6G.
This presentation provides an overview of the 3GPP journey towards 5G-Advanced and 6G technologies.
Towards the Next Generation of Wireless Infrastructure with Sensing
Cellular communication connects people around the world using radio-frequency spectrum. The harmonization of global spectrum is precisely defined and managed by the International Telecommunication Union – Radiocommunication Sector (ITU-R). Without the support of governments, regulatory authorities, and industry experts, we would not have been able to enjoy the benefits of the wireless internet in 3G, the internet of applications in 4G, and ubiquitous connectivity in 5G. These generational advancements are driven not only by the insatiable and ever-increasing demand for capacity but also by the rising number of smart devices and subscriptions, as well as an increase in the average data volume per subscription. However, upon closer examination, we find an intricate set of requirements and design complexities that intertwine. These include market requirements and applications, technical specifications, and radio technology developments, all of which must align with wireless standardization activities.
The ITU-R has developed a framework for IMT-2030 (also known as 6G), which encompasses emerging services, applications, and technical aspects. One of the pillars of this framework is the integrated multi-dimensional sensing technology, which aims to enhance high-precision positioning through object and presence detection, localization, imaging, and mapping. Furthermore, communication operations at higher emerging spectrum (also known as FR3 band) and the proliferation of wireless nodes in the network lead to new innovations in radio components and signal processing algorithms within the radio unit. This, in turn, allows communication devices to additionally serve as sensing devices. From an infrastructure perspective, such a technology enables communication-centric services, including environment monitoring, network resource optimization and interference management for base stations and industrial Internet of Things. However, the requirements and technical specifications vary depending on the specific application. Nevertheless, the exploitation of full-duplex radio circuits in a radar-like operation for joint monostatic sensing and communications is expected to be a prominent feature in future wireless systems due to the additional position and velocity information it provides to the network, albeit at an incremental cost due to hardware constraints.
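To make the monostatic idea concrete, a toy calculation (all numbers below are assumptions, not figures from the talk): a full-duplex node estimates range from the round-trip delay of its own transmission and radial velocity from the Doppler shift.

```python
c = 3e8          # speed of light (m/s)
fc = 10e9        # assumed carrier frequency (Hz)
tau = 0.8e-6     # assumed measured round-trip delay (s)
fd = 1333.0      # assumed measured Doppler shift (Hz)

rng = c * tau / 2          # one-way range: 120 m
vel = c * fd / (2 * fc)    # radial velocity: ~20 m/s
print(f"range = {rng:.0f} m, radial velocity = {vel:.1f} m/s")
```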
In this presentation, we will explore trends in global standardization activities, broader market applications, technical requirements, and fundamental technical challenges by analyzing two distinct use cases, i.e., smart cities and smart factories. We will also examine the integration of sensing and communications for broad market adoption from an industrial perspective.
Challenges in the Assessment of Human RF Exposure to Mobile Communications including 5G and 6G
Mobile data transmission and communication technologies have continuously evolved over decades. Currently, the 5th generation mobile communication standard (5G New Radio) is being rolled out, and the 6th generation (6G) is under research. The introduction of 5G, and thus massive MIMO transmission, enables beamforming and beam steering and optimizes the transmission channel according to the actual channel state in terms of resource allocation and data rates. In addition to the frequency range 0.41–7.125 GHz (FR1), further spectrum between 24.25 and 52.6 GHz (FR2) has been designated for 5G mobile communications, and this will be extended even further, up to 300 GHz, for the 6G standard.
The new transmission technologies and frequency ranges challenge the research and implementation of efficient and accurate techniques to assess the associated RF exposure, with a focus on base stations, as part of risk communication to enable smooth introduction and broad acceptance. Standard-specific assessment methods are required to evaluate the electromagnetic fields (EMF) under maximal load operation, and to compare them against legal exposure limits. These methods rely on numerical simulations, measurements, or combinations thereof.
Ray-optical methods are a powerful tool to simulate EMF distributions, e.g., around base stations. The spatial resolution with which the propagation environment is modelled must increase in accordance with the increasing carrier frequencies; everyday models like OpenStreetMap are no longer accurate enough for this purpose. We are currently researching digital twins based on three-dimensional high-resolution map data for EMF assessment. Such digital twins help to account for multipath propagation effects in a mobile radio channel and, in consequence, to adapt the measurement methods appropriately.
Concerning measurement techniques, the maximum exposure is usually derived from measuring the field strengths associated with time-constant broadcast signals, and extrapolating these values to maximum-load conditions. However, the introduction of massive MIMO has challenged this approach, as the broadcast signals are transmitted with a different antenna gain pattern than the communication traffic signals. This diversity requires an accurate correction of the location-dependent gain differences between traffic and broadcast beam patterns. In addition, techniques based on provoking data traffic from the base station using a mobile phone are under study. Depending on the channel state, in present 5G NR base stations employing massive MIMO techniques, data can be transmitted simultaneously to the user terminal via up to four beams (rank-1 to rank-4 transmission). The rank has a significant impact not only on the resulting data rate but on the exposure as well. Hence, the channel state must be considered for correct exposure assessment.
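Schematically, the extrapolation logic looks as follows (a simplified sketch with assumed values; actual procedures follow standards such as IEC 62232, and the correction factors are measurement- and site-specific):

```python
import math

e_ssb = 0.5         # assumed measured broadcast (SSB) field strength, V/m
n_load = 100.0      # assumed power ratio of full traffic load vs. broadcast signal
gain_corr_db = 6.0  # assumed traffic-vs-broadcast beam gain difference at the location

# Field strength scales with the square root of power.
e_max = e_ssb * math.sqrt(n_load) * 10 ** (gain_corr_db / 20)
print(f"extrapolated maximum exposure: {e_max:.1f} V/m")
```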
This contribution addresses current research topics and indicates possible solutions.
3:30 PM - 4:00 PM
Coffee Break & Networking
2:00 PM - 4:00 PM
Session 4
Campus/Non-Public Networks
5G-ACIA Insights into 5G for Industrial IoT and Non-Public Networks
Industrial IoT (IIoT) is the biggest enterprise market segment, and therefore it was given high priority in 5G standardization. The timelines for 5G and the fourth industrial transformation (Industry 4.0) seemed to align perfectly, so the joint investment in standards and development seemed to make perfect sense. The needs of this vertical were studied very carefully, and requirements for the 5G standards were drafted based on inputs from key industrial automation companies. The standout features for Industrial IoT included support for time-sensitive communication and 1 ms latency together with URLLC features, as well as options for non-public networks (NPN).
5G-ACIA published a whitepaper on 5G NPNs in March 2024. 5G NPNs are already used for industrial applications today. When we analyze the different NPN deployment options, we start from realistic example industrial operation scenarios. Then we present example NPN deployment options that support the use cases, are based on standard technology, and are in line with today's market offering. From there we discuss the different aspects that affect the selection of a deployment option.
Now, at the time of the early but defining first steps of 6G, when we look at the 5G situation we learn there is still some way to go to meet some expectations regarding IIoT specifics. Current IIoT deployments focus on use cases such as connected workers, worker safety and machine digital twins, which mostly do not utilize the features specifically designed for IIoT, and many of the networks still use 4G technology.
This presentation also discusses the experiences from the marketplace: what is good, where we still need improvements, and most importantly how we can use this experience going forward to 6G. Based on our experience, we explain why 5G IIoT is taking more time than expected, how standardization should address this learning, and specifically why some of the 5G IIoT requirements are still relevant for 6G standards development and how to phase the work.
Nokia is one of the founding members of 5G-ACIA, and Uli Rehfuess will provide some insightful views on where we are on the road towards Industrial 5G.
PKI in infrastructures for industrial automation technology with 5G
According to the latest studies by TÜV, the number of cyber attacks on industrial companies is growing continuously. At the same time, the number of 5G campus networks in industrial environments is increasing. New use cases are combining 5G communication with traditional production technology. In the past, physical isolation of the OT (especially when using wired communication) and network segmentation protected unsecured end devices from unauthorized access. When using the air interface of wireless communications technologies, such as 5G or WLAN, access to connected networks can no longer be protected by physical access controls alone.
This paper presents a PKI-based security framework meant to protect the points of connection between the 5G network, company-internal IT networks, and OT networks. It aims to support companies in securely integrating their IT and OT networks and realizing their Industry 4.0 use cases by offering automated certificate management for the OT domain. The presented security framework assumes three distinct security domains: the 5G network itself, an OT domain, and an IT domain. The 5G core offers a range of built-in security functionality that can potentially support certificate management over the entire operating cycle of an automation component. The IT domain is expected to employ commonly used measures for ensuring IT security within an internal network, e.g. firewalls, network access control, and user authentication and authorization services. While IT networks traditionally focus on confidentiality, availability, and integrity, OT components focus on reliability, authenticity, and compliance with latency limits. The OT domain may feature a variety of end devices that support only very specific, sometimes outdated, security features, or possibly no security features at all. Our framework aims to bridge the gap between the high level of security already present in the 5G and IT domains and the differing levels of security found in existing OT installations.
The implemented security framework is integrated into a multi-access edge computing (MEC) environment and has direct access to the 5G core network. It provides certificate-based authentication of devices and users in different domains, enabling secure communication between domains, e.g. between IT and OT. It also supports secure communication between end devices and the MEC, and among OT devices. Selected use cases will be presented to demonstrate how this framework can be used in practice. Typical industrial use cases such as localization and communication between production systems will be showcased.
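As a flavour of what automated certificate management means at the device end, here is a minimal, hypothetical enrollment step using the Python cryptography library; the device name is made up, and the framework's actual enrollment protocol (e.g. EST or CMP) is not shown:

```python
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.x509.oid import NameOID

# Generate a device key pair and a certificate signing request (CSR)
# that a CA hosted in the MEC could sign.
key = ec.generate_private_key(ec.SECP256R1())
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([
        x509.NameAttribute(NameOID.COMMON_NAME, "plc-001.ot.example"),  # hypothetical
    ]))
    .sign(key, hashes.SHA256())
)
print(csr.public_bytes(serialization.Encoding.PEM).decode())
```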
[The presented use cases and security architecture are developed within the research project “PKI in infrastructures for industrial automation technology” (PIA5), funded by the Federal Office for Information Security (BSI).]
Integration of OPC UA in 5G Networks
OPC UA is a platform-independent standard used for communication among various types of devices and systems in the industrial domain. In order to integrate OPC UA FX with the 5G system, it is important to understand the core principles of OPC UA, since the 5G system is the medium for establishing and maintaining connectivity between the components and the various automation functions represented by FunctionalEntities (FEs). This report provides an overview of OPC UA FX, investigates how it could be integrated with 5G, and assesses its technology readiness level.
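To give a feel for the OPC UA side of such an integration, a minimal client sketch using the Python asyncua library (the endpoint address and node id are hypothetical placeholders):

```python
import asyncio
from asyncua import Client

async def main():
    # Connect to an OPC UA server reachable across the 5G network
    # and read a single variable node.
    async with Client(url="opc.tcp://192.168.10.5:4840") as client:  # placeholder address
        node = client.get_node("ns=2;i=1001")  # placeholder node id
        print("value:", await node.read_value())

asyncio.run(main())
```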
3:30 PM - 4:00 PM
Coffee Break & Networking
4:00 PM - 5:30 PM
Session 5
Mobile Communications
5G NTN Takes Flight: 5G Non-Terrestrial Networks Evolving Towards 6G
From the beginning, 3GPP's vision of ubiquitous communication has included the use of satellite- or airborne-based network components.
Heading towards 6G, we envision the contribution of NTN to the 6G system; IMT-2030 sets some clear requirements, incorporating a non-terrestrial, unified network architecture from its inception. Release 17, considered the start of NTN in 5G, targets enabling non-terrestrial network services. Later releases will enhance the NTN contribution, and even if 6G may be an evolution rather than a revolution, NTN will play a pivotal role from the advent of 6G.
This presentation will outline the technology evolution from the first NTN releases up to the anticipated technology evolutions on the path to 6G. The NTN technology starts with a transparent-mode architecture, incorporates regenerative mode and multi-connectivity mobility scenarios on a mid-term basis, and then looks towards long-term aspects like 3D unified and resilient networks, ultimately paving the way towards “beyond cellular” in a 6G network that includes NTN from its inception.
In addition, we would like to present the evolution of test and measurement (T&M) aspects and how T&M can ensure the successful operation and development of future NTN networks.
Unified & ML-Optimized Wireless Testbeds for IoT Communication Evaluation & Hands-on Training
The integration of the Internet of Things (IoT) poses major challenges for many companies because many complex and new communication solutions, from short-range wireless and LPWAN to next-generation cellular (5G/5.xG/6G) technologies, can be used. It is essential to systematically evaluate and decide on a suitable IoT communication solution based on business and technical needs, especially for SMEs and corporates without an IoT/communications background. There is also a need for hands-on training and experimentation environments for IoT communication technologies, systems, and applications in the higher-education market.
This presentation will begin with an introduction to the challenges posed by IoT communication technologies, covering the technological landscape of short-range wireless, LPWAN, and next-generation cellular (5G/6G). It will then discuss the need for unified testing requirements and methodologies for evaluating these technologies, and explore the systematic evaluation and selection of appropriate IoT communication solutions, emphasizing the need for unified testbed methodologies for evaluation and hands-on training. Drawing on requirements and results from the author's industrial and academic R&D projects at Offenburg University and its spin-off IICT, it will cover the design and implementation of practical testbed environments for training and for evaluating IoT communication technologies. The session will also highlight the integration of machine learning algorithms for network optimization, showcasing real-world use cases and the protocol-agnostic implementation of test scenarios.
The presentation will conclude with a discussion of current challenges and future directions, addressing the ongoing development of IoT communication testbeds and the importance of continued innovation in this field. It will highlight opportunities for collaboration between academia and industry, particularly in developing industry-relevant use cases and hands-on training modules. This 30-minute presentation aims to provide a comprehensive technical overview of IoT communication evaluation and hands-on training methodologies, with a strong focus on practical applications and real-world scenarios.
An Operator’s perspective on the e2e ecosystem for both consumer and business IoT
An operator's perspective on the end-to-end ecosystem for both consumer and business Internet of Things (IoT), covering the following aspects:
• Device considerations
• Evolution of cellular radio for IoT, e.g. NB-IoT, LTE-M, RedCap and Ambient IoT
• Wireless radio and networks, e.g. ZigBee, Wi-Fi, Thread
• Application and protocol aspects, e.g. Matter, MQTT, LWM2M (see the sketch after this list)
• IoT platforms and solutions for data ingestion, analytics, and reporting
• Data exchange capabilities for secure data sharing
• IoT architecture principles including security by design
• IoT ecosystem
• IoT standards
• IoT futures and research.
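For instance, on the application-protocol layer, publishing a single sensor reading over MQTT takes only a few lines with the paho-mqtt library (a minimal sketch; the broker address and topic are placeholders):

```python
import paho.mqtt.client as mqtt

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0 API
client.connect("broker.example.com", 1883)              # placeholder broker
client.loop_start()
info = client.publish("home/livingroom/temperature", payload="21.5", qos=1)
info.wait_for_publish()                                 # block until the broker acks
client.loop_stop()
client.disconnect()
```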
4:00 PM - 5:30 PM
Session 6
Campus/Non-Public Networks
Implementing Innovative Cloud-Based SASE Technologies for Enhanced Security in 5G Networks - Cancelled
Speaker: Florian Rutsch, Senior Sales Engineer & Ivan Majdan, Regional Sales Director CEUR at Cradlepoint, part of Ericsson
The advent of 5G technology introduces both opportunities and challenges for modern enterprises, particularly in the realm of network security. This presentation will explore the application and benefits of cloud-based Secure Access Service Edge (SASE) solutions that integrate software-defined wide area networking (SD-WAN) with advanced security functions to create robust, agile networks. The session will illustrate the increasing complexity and security risks associated with the expanding networks of IoT devices and agile mobile connectivity, which is needed for modern businesses, and how SASE technologies address these challenges – by employing zero-trust security models and remote browser isolation to safeguard against unauthorized access and cyber threats.
Possible talking points in the speaker session:
1. Introduction to 5G and Wireless Network Challenges:
● Overview of 5G technology advancements and the associated increase in connected devices.
● Discussion of new security vulnerabilities introduced by these developments.
2. Understanding SASE:
● Definition of Secure Access Service Edge (SASE) and its importance in modern network architectures.
● How SASE combines SD-WAN and security features into a unified cloud-based service.
3. Core Components of SASE Solutions:
● Deep dive into zero-trust security models: principles and advantages.
● The role of remote browser isolation in protecting user endpoints from malicious online activities.
4. Implementation and Benefits:
● Case studies on deployment strategies and the operational challenges and solutions.
● Discussion of the tangible benefits of SASE.
5. Future Trends and Predictions:
● Potential developments in SASE technology as wireless connectivity continues to evolve.
● Discussion on AI and machine learning applications for dynamic security measures and traffic management in SASE environments.
How 5G Helps Industry 4.0 to be Wireless Without Quality of Service Compromise: Practical Test Results
Design & Performance Analysis of gPTP Time Synchronisation Through a 5G-TSN Bridge
Day 2 - November 14, 2024
9:30 AM - 11:00 AM
Opening Session
Panel Discussion
PANEL DISCUSSION: The Future of Wireless?
10:30 AM - 11:00 AM
Coffee Break & Networking
11:00 AM - 2:00 PM
Session 7
Matter
Why Matter matters
Founded in 2002, the Connectivity Standards Alliance is a Member-based organization developing and delivering universal, open standards for the Internet of Things (IoT). The organization is dedicated to simplifying the complex, creating an open path to IoT adoption and innovation, and promoting universal open standards, enabling all objects to connect and interact securely—regardless of brand. The Alliance unites global competitors and partners across the value chain to work side-by-side and create better outcomes through a collaborative culture and community.
On October 4, 2022, the Alliance unveiled Matter 1.0, a global, industry-unifying, open-source standard designed to reduce barriers to IoT device interoperability, ensuring smart home products work together seamlessly. Since that time, there have been nearly 40,000 specification downloads, with more than 2,100 products certified by the Alliance. In May 2023, the Alliance released Matter 1.1, making it simpler for device makers and developers to get started with Matter and allowing for easier product certification. In October 2023, version 1.2 was released, expanding Matter’s support with nine new device types, including refrigerators, room air conditioners, dishwashers, laundry washers, robotic vacuums, smoke/CO alarms, air quality sensors, air purifiers, and fans. This version also featured core improvements to test plans, cluster revisions, enhancements to the test harness and scripts, and SDK (software development kit) and spec improvements.
Matter remains steadfast in its promise to dismantle interoperability barriers by fostering seamless collaboration among smart home devices. It forges deeper connections between more objects, simplifies development for manufacturers, and enhances compatibility for consumers. This promise is upheld through collaboration across companies worldwide, allowing for the best ideas and technologies to solve the biggest smart home problems. With this, Matter opens new doors to innovation while making it easier for consumers to connect devices.
Matter’s journey breaks ground, marking a significant milestone with an unprecedented open-source SDK approach, go-to-market commitment from major brands and platforms, and ongoing support from more than 345 global companies and over 4,500 individuals representing the entire value chain, including silicon, hardware and software, ecosystems, platforms, and retail. Matter’s footprint continues to expand across the smart home, the IoT ecosystem, and in consumer minds.
The launch of Matter was made possible through the combined efforts of Member companies such as Amazon, Apple, Comcast, Google, SmartThings, and Alliance Board Member companies IKEA, Legrand, NXP Semiconductors, Resideo, Schneider Electric, Signify, Silicon Labs, Somfy, and Wulian. If you'd like to know more about the Connectivity Standards Alliance and the more than 700 companies inspired to change the future of IoT, we would love to share our story.
Adopting Matter without losing your existing smart home IoT devices
The Matter standard takes bold steps forward for smart home IoT, as illustrated in other presentations at this conference: a common data and application model, support by major ecosystems (no more lock-in), an IP-based design, strong security, a unified install flow, and an open-source SDK.
This is a step forward from existing standards for smart home IoT devices - but to make adoption seamless for consumers with existing smart homes, we need to make sure they can use the new Matter standard alongside their existing devices, such as those based on Zigbee and Z-Wave. That way, they can keep using the devices they trust and love alongside new Matter devices - protecting their previous investments as well as preventing e-waste. In other words, the new standard needs to be an evolution rather than a revolution - which is required to give those who are not early adopters trust in the stability of smart home IoT offerings.
One obvious solution for the manufacturer is to create and provide a software update that adds Matter support to the existing device alongside its existing functionality. However, this may not always be feasible due to device resources or other constraints.
The solution we have proposed, and which was adopted into the Matter 1.0 standard, is to standardize the concept of bridging: the bridge is a Matter device which exposes the functionality of non-Matter devices (any protocol can be supported here) as virtual Matter devices towards the Matter controllers. Those Matter controllers can thus find and use the non-Matter devices without needing to know the peculiarities of the non-Matter interface.
This bridging concept was adopted by many vendors to smooth the transition for their existing customers into Matter, and many have implemented and certified a bridge between Matter and their existing pre-Matter devices.
In the presentation, we will explain how the bridging concept works for various device types, such as actuators (lights, blinds, etc.) which can be controlled from Matter controllers, as well as sensors and switches which can provide triggers to those Matter controllers.
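To make the bridging concept concrete, the sketch below models a bridge exposing a non-Matter (here, Zigbee) device as a virtual Matter endpoint. All class and method names are hypothetical illustrations of the data flow, not the Matter SDK API.

```python
# Minimal sketch of the bridging idea: a bridge exposes each non-Matter
# device as a virtual Matter endpoint. All names here are hypothetical
# illustrations, not the actual Matter SDK API.
from dataclasses import dataclass, field

@dataclass
class ZigbeeLight:
    """Stand-in for an existing non-Matter device."""
    address: str
    on: bool = False

    def set_on(self, on: bool) -> None:
        self.on = on  # in reality: send a Zigbee command

@dataclass
class MatterBridge:
    """Maps bridged devices to virtual Matter endpoint IDs."""
    endpoints: dict = field(default_factory=dict)
    next_endpoint_id: int = 1

    def expose(self, device) -> int:
        """Publish a non-Matter device as a virtual Matter endpoint."""
        endpoint_id = self.next_endpoint_id
        self.endpoints[endpoint_id] = device
        self.next_endpoint_id += 1
        return endpoint_id

    def handle_on_off(self, endpoint_id: int, on: bool) -> None:
        """A Matter controller's OnOff command, forwarded to the device."""
        self.endpoints[endpoint_id].set_on(on)

bridge = MatterBridge()
ep = bridge.expose(ZigbeeLight(address="0x4F21"))
bridge.handle_on_off(ep, True)  # a controller toggles the bridged light
```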
Cancelled
Trust Matters: Attacking Smart Devices Authenticity
Matter is the new future-proof, vendor-agnostic connectivity standard designed for IoT and smart building environments, already adopted in many different setups, including alarm sensors, security cameras, and door locks. The standard, maintained by the Connectivity Standards Alliance (CSA) and supported by industry-leading companies, puts a lot of effort into ensuring that only trustworthy devices are admitted into the ecosystem. The goal is to prevent attacks where rogue devices join the network, gaining access to privacy-sensitive information or performing disruptive operations.
To achieve its objective, the standard defines a Device Attestation Procedure that relies on digital certificates and private keys deployed on the devices and used during commissioning to prove authenticity and standard compliance. Matter's official threat model highlights the importance of safeguarding this cryptographic material, but how have manufacturers been handling this aspect so far?
This research puts this model to the test and presents a thorough analysis of the protection mechanisms implemented in one of the first commercial Matter sensors available on the market. Our investigation revealed how both the hardware and software defenses implemented by the manufacturer can be bypassed, which ultimately allowed us to compromise the confidentiality of the device’s private key. An attacker achieving the same result could potentially create counterfeit devices indistinguishable from genuine certified ones, thus jeopardizing the security and privacy of an entire Matter network.
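For readers unfamiliar with certificate-based attestation, the sketch below shows the general shape of such a check using Python's cryptography package: verify that the device attestation certificate chains to a trusted CA, then challenge the device to prove possession of the matching private key. It is a generic, simplified illustration, not Matter's actual Device Attestation Procedure.

```python
# Simplified illustration of certificate-based device attestation:
# the commissioner checks that the device's attestation certificate was
# issued by a trusted CA and that the device can sign a fresh challenge
# with the matching private key. This is a generic sketch, not the
# actual Matter Device Attestation Procedure.
import os
from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def verify_device(dac_pem: bytes, ca_cert: x509.Certificate,
                  sign_challenge) -> bool:
    """`sign_challenge` is a hypothetical callable executed on the device."""
    dac = x509.load_pem_x509_certificate(dac_pem)
    # 1) Check the certificate was signed by the trusted attestation CA.
    ca_cert.public_key().verify(
        dac.signature, dac.tbs_certificate_bytes,
        ec.ECDSA(dac.signature_hash_algorithm))
    # 2) Check the device holds the private key: it must sign a fresh nonce.
    nonce = os.urandom(32)
    signature = sign_challenge(nonce)
    dac.public_key().verify(signature, nonce, ec.ECDSA(hashes.SHA256()))
    return True  # both verify() calls raise on failure
```

The attack described in the talk targets exactly the private key behind step 2: once extracted, a counterfeit device can answer the challenge just like a genuine one.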
1:00 PM - 2:00 PM
Lunch & Networking
11:00 AM - 2:00 PM
Session 8
WSN, IIoT
Energy Autonomous CO2 Sensor Node for Indoor Applications
CO2 sensors are often used in building automation to monitor air quality and to trigger measures to improve it. These devices proved very useful during the COVID-19 pandemic. They are usually powered from the mains or by batteries. However, there are use cases where energy autonomy is desired. In some cases, it is also advantageous to transmit the data to a central point using an appropriate long-range protocol.
CO2 sensors using the NDIR (non-dispersive infrared) method are among the best in terms of measurement quality and lifespan. However, the method requires more energy, which strains the node's power budget. LPWAN protocols also require more energy than short-range wireless systems. All this makes it difficult to design a good-quality node that measures frequently and yet runs on energy harvested indoors.
We first analysed the energy requirements of different NDIR sensors on the market in order to choose the most appropriate one. We then applied power-management techniques to optimise the node's consumption and make it operate in an office environment using LoRa.
At 350 lux (artificial light) and using a solar cell of 8 cm², sufficient energy is harvested to take measurements and transmit the data every minute using the LoRa protocol with SF7.
We will present the challenges of the design, explain how those challenges were solved and show the results.
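As a rough illustration of the kind of budgeting involved, the sketch below balances harvested against consumed energy per measurement cycle. Only the 350 lux, 8 cm² cell, one-minute interval, and LoRa SF7 figures come from the abstract; every other number is an assumed placeholder.

```python
# Back-of-envelope energy budget for an energy-autonomous sensor node.
# Only the 350 lux, 8 cm² cell, 1-minute interval and LoRa SF7 figures
# come from the abstract; all other numbers are illustrative assumptions.
HARVEST_UW_PER_CM2 = 15.0   # assumed indoor PV yield at 350 lux (µW/cm²)
CELL_AREA_CM2 = 8.0
E_NDIR_UJ = 3_000.0         # assumed energy per NDIR CO2 measurement (µJ)
E_LORA_SF7_UJ = 4_000.0     # assumed energy per LoRa SF7 uplink (µJ)

harvested_uw = HARVEST_UW_PER_CM2 * CELL_AREA_CM2   # average harvested power
e_cycle_uj = E_NDIR_UJ + E_LORA_SF7_UJ              # energy per measure+send
min_period_s = e_cycle_uj / harvested_uw            # shortest sustainable period

print(f"average harvest: {harvested_uw:.0f} µW")
print(f"cycle cost:      {e_cycle_uj:.0f} µJ")
print(f"sustainable every {min_period_s:.0f} s")    # ~58 s under these assumptions
```

Under these assumed figures the node sustains roughly one measurement-plus-uplink per minute, which is consistent with the result reported in the abstract.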
Energy Efficiency of Wireless Sensor Networks in Industrial Automation: A Case Study from the HoLoDEC Project
In industrial automation, several trends have recently been observed, such as condition monitoring in moving applications, robotics, drives, and assembly lines. In hygienic environments and in retrofitting, avoiding cables can also be a major advantage. Given these requirements, wireless sensor networks (WSNs) offer a versatile solution and simple integration into various systems. With cables removed, data transmission has to be wireless and power has to be supplied by a battery. Multiple sensing devices can therefore be combined in an energy-efficient mesh-topology network, where data is forwarded over different nodes to a gateway. The gateway can process the data or push it to any cloud or database. Some wireless mesh networks apply cost-based routing related to energy consumption. To extend battery lifetime further, each node has to fulfill ultra-low-power (ULP) requirements. In this presentation we will describe the design and the performance of a prototypical WSN. The selected hardware consists of a radio module, which manages the connection to the wireless mesh network; a microcontroller, which processes data according to the customer's configuration and forwards it to the radio module; and a sensor frontend, which acquires and provides data. The radio module operates at 2.4 GHz on the Bluetooth physical layer and performs with ultra-low power consumption. The microcontroller applies various low-power modes in the different phases of its program flow. Most of the energy is consumed by the wireless transmission of data packets. To reduce energy consumption further, multiple promising data compression and processing algorithms are evaluated. The additional energy consumed by data compression is compared to the energy saved by the reduced wireless payload. Finally, numbers for the extended battery lifetime of the proposed WSN will be presented. This presentation is based on the HoLoDEC project, which is funded by the Bundesministerium für Bildung und Forschung under grant number 16ME0699.
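The compression trade-off at the heart of this evaluation can be captured in a one-line energy balance: compression pays off when the radio energy saved on the smaller payload exceeds the CPU energy spent compressing. A hedged sketch with placeholder numbers:

```python
# Sketch of the compression trade-off described above: compressing
# costs CPU energy but shrinks the radio payload. All numbers are
# illustrative assumptions, not HoLoDEC project measurements.
E_TX_PER_BYTE_UJ = 2.0    # assumed radio energy per payload byte (µJ)
E_CPU_PER_BYTE_UJ = 0.3   # assumed compression energy per input byte (µJ)

def net_saving_uj(payload_bytes: int, compression_ratio: float) -> float:
    """Energy saved per packet; positive means compression pays off."""
    saved_tx = E_TX_PER_BYTE_UJ * payload_bytes * (1 - 1 / compression_ratio)
    spent_cpu = E_CPU_PER_BYTE_UJ * payload_bytes
    return saved_tx - spent_cpu

for ratio in (1.2, 2.0, 4.0):
    print(f"ratio {ratio:.1f}x -> net saving "
          f"{net_saving_uj(200, ratio):+.0f} µJ per 200-byte packet")
```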
A Testbed to Identify Vulnerabilities and Secure Wireless Systems in IIoT
The Industrial Internet of Things (IIoT) has revolutionized industries by enabling advanced automation and enhancing data processing capabilities. In addition, using wireless technology in IIoT offers numerous benefits such as flexibility, resilience, scalability, and mobility. However, the increasing connectivity and wireless communications associated with IIoT make processes more vulnerable to cyberattacks. To address this issue, we built a testbed with multiple wireless protocols to evaluate the security of IIoT devices and wireless networks. The testbed allows for identifying and assessing security vulnerabilities and developing and evaluating mitigation strategies, thereby directly impacting the security of wireless systems in IIoT.
Our research began with an extensive review of essential standards and protocols in the IIoT field. We then applied this knowledge to create a unique compatibility stack, including both wired and wireless protocols, that can be directly used by system designers in selecting the most appropriate communication protocols for IIoT applications. While our findings demonstrate the increasing adoption of wireless technologies, they also highlight the fact that very few testbeds cover wireless protocols, and even fewer consider their security aspects.
Our testbed utilized a variety of communication methods prevalent in the industry, such as Modbus, Profinet, Wi-Fi, Bluetooth, WirelessHART, and LoRaWAN. The testbed was placed inside a Faraday cage so that security tests could not cause unwanted interference to the outside world. Amazon Web Services was used for the cloud infrastructure, incorporating critical services such as AWS IoT SiteWise, AWS IoT Greengrass, and AWS IoT Core. In addition to the current broadband internet connection, we plan to expand the testbed to include 5G cellular networks for remote communications.
We are simulating several potential attack scenarios, including well-known attacks like the Mirai botnet as well as wireless-specific attacks such as wireless signal analysis and smart selective jamming. These simulations are enhancing our understanding and helping us improve the security measures for IIoT devices and wireless networks.
The IIoT testbed we built provides a strong foundation for understanding vulnerabilities and developing effective mitigation strategies that enhance the security and resilience of wireless IIoT systems against evolving cyber threats.
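As a flavor of what such monitoring can look like, the sketch below flags channels whose packet error rate deviates sharply from the network-wide baseline, which is one plausible signature of a smart selective jammer. It is an illustrative toy, not the testbed's actual tooling.

```python
# Toy monitor in the spirit of the jamming experiments: a selective
# jammer hits specific channels, so a per-channel packet error rate
# (PER) far above the overall baseline is a red flag.
from collections import defaultdict

stats = defaultdict(lambda: {"sent": 0, "lost": 0})

def record(channel: int, delivered: bool) -> None:
    stats[channel]["sent"] += 1
    stats[channel]["lost"] += 0 if delivered else 1

def suspicious_channels(threshold: float = 3.0) -> list[int]:
    """Channels whose PER exceeds `threshold` times the overall PER."""
    total_sent = sum(s["sent"] for s in stats.values())
    total_lost = sum(s["lost"] for s in stats.values())
    overall_per = total_lost / max(total_sent, 1)
    return [ch for ch, s in stats.items()
            if s["sent"] >= 20  # require enough samples per channel
            and s["lost"] / s["sent"] > threshold * max(overall_per, 0.01)]
```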
Building a Smart Home Integration Business with Z-Wave and Z-Wave Long Range Technology
This business case proposes the strategic implementation of Z-Wave technology, including Z-Wave Long Range (Z-Wave LR), as the backbone for an integration business specializing in smart home solutions. The Z-Wave wireless communication protocol offers robust advantages in terms of interoperability, reliability, and market penetration. The addition of Z-Wave LR significantly enhances the range and scalability of smart home networks, providing a compelling value proposition for residential and commercial clients. Leveraging these benefits can help the business capture a significant regional share of the rapidly growing smart home services market.
1:00 PM - 2:00 PM
Lunch & Networking
2:00 PM - 4:00 PM
Session 9
Mioty
Technical benefits of mioty® and what it means in real installations
mioty® is a breakthrough wireless technology developed by the German Fraunhofer Institute for Integrated Circuits. It is a worldwide open standard for IoT, offering a low-power wide-area network (LPWAN) solution that overcomes many of the limitations of traditional LPWAN technologies. With its patented telegram splitting mechanism, mioty® divides messages into multiple sub-packages and transmits them at various times and frequencies, ensuring unprecedented robustness, scalability, and energy efficiency.
But how can these advantages be derived from the technique itself in a simple way? One answer is the spectral footprint of a radio technology, and in particular that of mioty® compared to other LPWAN technologies. The spectral footprint measures how efficiently a technology uses the radio spectrum, just as a CO2 footprint measures the energy efficiency of a car. The smaller the spectral footprint, the more IoT devices can coexist in the same radio cell while still achieving high quality of service. A smaller spectral footprint also makes a radio technology more robust against other radio signals, as its transmissions are less likely to be overlapped, and thus disturbed, by them.
The observations from the spectral footprints of different LPWAN technologies are theoretical, but they help explain the results observed in real field installations. When mioty® is compared with other LPWAN technologies in real scenarios, telegram splitting translates into longer range and more reliable data transmission while using less energy per message. These advantages make for a cost-effective long-term investment, allowing users to reduce capital and operating costs and benefit from a low total cost of ownership (TCO). Hence, mioty® is ideally suited for large-scale IoT installations such as smart cities, utilities (with large metering installations), and industry. Given all the advantages that mioty® offers, it is well positioned to become the new global IoT radio standard.
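The robustness argument can be made tangible with a small Monte Carlo experiment: split a telegram into many short sub-packets, of which only about half must survive thanks to forward error correction, and compare that with a single long packet that any one collision destroys. The 24-sub-packet and 50% figures reflect commonly cited mioty design parameters; the uniform loss model is a deliberate simplification.

```python
# Monte Carlo sketch of why telegram splitting is robust: a message is
# split into many short sub-packets with forward error correction, so
# the telegram survives as long as roughly half of them get through.
# The 24-sub-packet / 50% figures reflect commonly cited mioty design
# parameters; the uniform loss model is a simplification.
import random

def telegram_ok(n_sub=24, min_needed=12, p_hit=0.3) -> bool:
    """Each sub-packet is independently destroyed with probability p_hit."""
    survivors = sum(random.random() > p_hit for _ in range(n_sub))
    return survivors >= min_needed

def monolithic_ok(p_hit=0.3, exposure=24) -> bool:
    """A single long packet fails if any of its time slots is hit."""
    return all(random.random() > p_hit for _ in range(exposure))

trials = 100_000
print("telegram splitting:", sum(telegram_ok() for _ in range(trials)) / trials)
print("single long packet:", sum(monolithic_ok() for _ in range(trials)) / trials)
```

Even with 30% of sub-packets destroyed, the split telegram almost always decodes, while the equivalent long packet almost never survives.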
Large scale Smart City networks with mioty
Smart city networks are the dominant application for Low Power Wide Area Networks (LPWANs). Many IoT applications in cities can be addressed with a single wireless network, but metering applications such as water, gas, or heat meters still mostly use the Wireless M-Bus protocol today, for short- and medium-range communication in walk-by or drive-by readout of meter devices. The Open Metering System Group (OMS Group e.V.), an interest group developing standards for communication interfaces for metering systems based on Wireless M-Bus, has recently published the OMS Generation 5 specification, which now includes long-range communication protocols, called OMS LPWAN. The TS-UNB protocol of mioty, specified in ETSI TS 103 357, is part of this OMS LPWAN specification, where it is called splitting mode. With the rollout of metering applications in smart cities using LPWAN, the number of network devices increases significantly, while the requirement on Quality of Service (QoS) remains paramount.
This presentation will give an overview of OMS Generation 5 and discuss the challenges of large-scale smart city networks, including utilities. It will show how mioty and its innovative channel access scheme, Telegram Splitting Multiple Access (TSMA), overcomes the limitations of existing random channel access schemes like ALOHA, which are common in most smart city and smart metering wireless protocols today. Examples of recent smart metering rollouts with mioty will be given, showing the benefits of the technology. Furthermore, it provides an outlook on future applications and challenges, such as the control of actuators and over-the-air updates of IoT devices, and how they could be integrated into smart city networks while sustaining high QoS.
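For context on the ALOHA limitation: the classic result for pure ALOHA is a throughput of S = G·e^(−2G), which peaks at only about 18.4% of channel capacity and collapses under load. The quick computation below makes the scaling ceiling that TSMA is designed to overcome explicit.

```python
# Pure ALOHA, the random access scheme most legacy LPWAN and metering
# protocols effectively use, has throughput S = G * exp(-2G): it peaks
# at ~18.4% of channel capacity (at G = 0.5) and collapses as the
# offered load grows.
import math

for g in (0.1, 0.5, 1.0, 2.0):
    s = g * math.exp(-2 * g)
    print(f"offered load G={g:.1f} -> ALOHA throughput S={s:.3f}")
```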
Revolutionizing Industrial Safety: mioty-powered Wireless Alerting Solutions
The "WAS4WOS" project, a collaboration between Swissphone Wireless AG and Fraunhofer Institute for Integrated Circuits (IIS), aims to develop, and enable the offering of a new generation of wireless communication systems for worker safety and critical messaging. Funded by the Federal Ministry for Education and Research (BMBF) in Germany and Innosuisse in Switzerland, the project is conducted within the Eurostars Programme powered by EUREKA and the European Community. The project endeavours to create fail-safe systems capable of ultra-reliable communication for both stationary and mobile applications.
Central to the project's objectives is the adoption of mioty® as the LPWAN radio technology, known for its robustness, low power consumption, and exceptional range. mioty® introduces several innovative features, including Class B and C operations, enabling bidirectional communication, multicast support, and low latency. By harnessing the capabilities of mioty®, the project seeks to establish a communication infrastructure that ensures seamless connectivity and effective transmission of critical alerts in diverse industrial environments.
Furthermore, the project's scope extends beyond basic safety functionalities to include features for automatic event monitoring and reporting. By integrating bidirectional IoT applications, the system enables real-time monitoring and control of various industrial processes, enhancing operational efficiency and productivity.
Presentation Points:
1. Introduction to the project and its applicable use cases
2. Adoption of mioty® as the LPWAN radio technology, highlighting new features such as Class B and C operations, multicast support, and low latency
3. Showcasing field test results and learnings regarding indoor coverage, localisation, robustness, and latency
3:30 PM - 4:00 PM
Coffee Break & Networking
2:00 PM - 4:00 PM
Session 10
WSN, IIoT
Wireless Mesh Secrets: How Physics Schooled the Engineers
IQRF® is a wireless mesh technology and standard, ideal for adding effortless wireless connectivity to any product. It excels in mesh network routing reliability, enabling seamless communication over hundreds of routers for dependable, scalable integration.
Wireless Mesh Networks (WMNs) and associated technologies are our primary area of expertise. WMNs are distinguished by their routing and scalability. Given the technical challenges associated with reliable wireless mesh routing, IQRF® engineers have made this issue a priority.
IQRF® has efficiently resolved the challenge of routing in WMNs and defined a new standard for WMN reliability. The IQMESH® routing protocol facilitates robust message transmission across extensive networks, supporting up to 255 hops with a range of up to 500 meters per device. It was specifically developed to preserve network reliability as the wireless system scales, making IQRF® well suited to extensive and high-density environments.
IQRF® was introduced in 2004 as a wireless mesh technology delivering industrial reliability, simple integration, strong security, interoperability, and IQRF True Low Power® as its key values. Since then, it has evolved into a broad ecosystem of transceivers, development tools, devices, products, software, firmware, documentation, cloud services, protocols, extensive IP, and ready-to-use systems.
IQRF® is a perfect fit for a wide range of applications, including lighting, facility management, industrial automation, heating control, environmental sensing, and more.
After two decades on the market, the IQRF® communication standard was introduced and its specifications released. This enables all users to access and implement all of its technical achievements under one royalty-free license.
Improving Reliability and Throughput of IO-Link Wireless
Wireless communication is an enabler for many applications in industrial automated production, Industry 4.0, and digital twins. It enables sensor technology on rotating or moving parts and is the basis for mobility and flexibility in the cooperation of transport and processing units. At the field or control level, the communication requirements on a wireless system are high, as real-time constraints and extremely low transmission error rates often have to be met, frequently in conjunction with control and safety requirements that many standard wireless systems can hardly fulfil.
IO-Link Wireless defines a reliable, real-time-capable, and deterministic protocol for control systems in industrial factory automation. IO-Link Wireless can be used to transmit signals from switching and measuring sensors as well as from simpler actuators. However, IO-Link Wireless reaches its limits when the amount of data to be transmitted increases, e.g. with multi-axis inertial measurement units, or when the movement of the sensors leads into areas with a poor radio connection, e.g. behind obstacles that strongly attenuate the radio signals.
The suitability of IO-Link Wireless for such difficult applications has been investigated in several projects. Packet error rates were measured at the level of bit transmission (physical layer) and media access (medium access control layer). The error statistics can be used to evaluate the error recovery protocol with packet repetition (ARQ) provided in the standard. In addition, an alternative error protection scheme based on network coding was designed, implemented, and evaluated. Furthermore, cooperative communication methods were investigated by using a repeater for IO-Link Wireless, which is not yet included in the standard.
In summary, the paper presents investigations and possibilities for using IO-Link Wireless effectively and reliably even at higher, continuous data rates and under difficult radio conditions due to spatial constraints.
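The ARQ evaluation rests on a simple piece of arithmetic: with an independent loss probability p per attempt, the residual packet error rate after n retransmissions is p^(n+1). The sketch below tabulates this; the independence assumption is exactly what breaks down under the burst errors behind obstacles, which is where coding and repeaters come in.

```python
# ARQ reasoning sketch: with an independent per-attempt loss probability
# p, a packet sent up to n+1 times is only lost if every attempt fails,
# so the residual PER is p**(n+1). The independence assumption is the
# simplification; burst errors behind obstacles break it, which is why
# network coding and repeaters were investigated as alternatives.
def residual_per(p: float, retransmissions: int) -> float:
    return p ** (retransmissions + 1)

for p in (0.01, 0.1, 0.3):
    print(f"p={p}: " + ", ".join(
        f"{n} re-tx -> {residual_per(p, n):.1e}" for n in (0, 1, 2)))
```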
Wireless OoB Debugging for IoT Devices
Most test tasks in the development of a new wireless IoT device are usually carried out directly at the developers' workstations or in special laboratories, but not in the actual application environment. As a result, many products and solutions do not work optimally or error-free during the market launch phase. In addition to commissioning errors, the causes of these practical problems are often completely different environmental conditions and the associated sources of interference, as well as typical continuous-operation problems (e.g. critical memory fragmentation, unexpected restarts). In practice, the details and causes of such errors can only be diagnosed and rectified in extensive field test phases. This requires special hardware and software configurations as well as context-related test concepts.
Using a typical IoT gateway example, with one wireless sensor and one 4G/5G cellular connection to the Internet, the presentation will show how the quality of a new wireless IoT application can be significantly improved through field test phases in the pilot customers' application environments and sophisticated test tools adapted to the respective task. In the example presented, the development team uses debugging-capable remote access via GDB/gdbserver to important key components over a separate out-of-band (OoB) radio connection. The bridge between the IoT gateway under test (device under test, DuT) and the OoB communication link is a special debug and test proxy system. It has a USB or SWD connection to the DuT, an SDR interface for the wireless sensor network, and the option of switching the DuT's power supply on and off. The debug and test proxy also measures electrical parameters (voltage, current). In addition, it can run automated monitoring and diagnostic functions to support developers in detecting complex error scenarios. The lecture provides the following two examples on this topic:
SNR/airtime monitoring: A periodic signal-to-noise ratio (SNR) measurement determines whether the frequency band offers sufficient quality for communication with the environmental sensors and whether the requirements of the intended IoT wireless application are met. An SDR-based measurement method is used for this, with the raw data run through various algorithms to detect anomalies.
System condition monitoring: A context-related parameter set (feature set) is defined that describes the system status as precisely as possible within a unit of time (e.g. number of Tx/Rx packets/bytes for all communication interfaces, memory status, CPU utilization, airtime of the WSN and WWAN interfaces, energy consumption). The parameters are periodically recorded and evaluated by monitoring and classification functions.
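A minimal version of the SNR monitoring idea, assuming complex baseband IQ samples from the SDR and an illustrative 10 dB link requirement (all names and thresholds here are placeholders, not the actual proxy firmware):

```python
# Sketch of the SNR monitoring idea: estimate band power from SDR IQ
# samples during and between transmissions, derive SNR, and flag
# anomalies against a minimum link requirement. Names and the 10 dB
# threshold are illustrative assumptions.
import numpy as np

def band_power_db(iq: np.ndarray) -> float:
    """Mean power of complex baseband samples, in dB."""
    return 10 * np.log10(np.mean(np.abs(iq) ** 2))

def snr_db(iq_signal: np.ndarray, iq_noise: np.ndarray) -> float:
    return band_power_db(iq_signal) - band_power_db(iq_noise)

def snr_alarm(iq_signal, iq_noise, min_snr_db=10.0) -> bool:
    """True if the band no longer supports the intended application."""
    return snr_db(iq_signal, iq_noise) < min_snr_db

# Synthetic check: a tone ~20 dB above the noise floor should not alarm.
rng = np.random.default_rng(0)
noise = (rng.normal(size=4096) + 1j * rng.normal(size=4096)) / np.sqrt(2)
tone = 10 * np.exp(2j * np.pi * 0.1 * np.arange(4096))
print(snr_db(tone + noise, noise), snr_alarm(tone + noise, noise))
```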
3:30 PM - 4:00 PM
Coffee Break & Networking
4:00 PM - 5:30 PM
Session 11
LoRa, LoRaWAN
LoRaWAN Unleashed: Maximizing Performance and Scalability
This expert presentation explores how to maximize the benefits of LoRaWAN, explaining its performance and how it scales in every dimension: data traffic, number of sensors, and coverage area. It also outlines the key elements of the LoRaWAN specification, highlights its strong security features, and emphasizes how quick and easy it is to install.
FUOTA over LoRaWAN, implementation and practical hints
This presentation explains FUOTA over LoRaWAN, beginning with a brief introduction about LoRaWAN's structure and implementation. It covers the basic functions added to LoRaWAN to support FUOTA, demonstrates how FUOTA operates, and discusses when it should be used. The presentation also addresses system extensions needed for FUOTA, the challenges involved, and the power budget requirements for its use.
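To give a feel for the basic mechanics, the sketch below fragments a firmware image, tracks received fragments on the device side, and verifies a digest before the image would be applied. Real FUOTA per the LoRa Alliance fragmentation specification additionally uses forward error correction so devices can reconstruct missed fragments without unicast retransmissions; that part is omitted here, and all names are illustrative.

```python
# Core FUOTA mechanics in miniature: the firmware image is split into
# fixed-size fragments multicast to devices; each device tracks which
# fragment indices arrived and verifies an image digest before applying.
# The FEC layer of the real LoRa Alliance fragmentation spec is omitted.
import hashlib

FRAG_SIZE = 200  # illustrative; must fit the LoRaWAN payload for the DR used

def fragment(image: bytes):
    return [image[i:i + FRAG_SIZE] for i in range(0, len(image), FRAG_SIZE)]

class FuotaSession:
    def __init__(self, n_fragments: int, digest: bytes):
        self.slots = [None] * n_fragments
        self.digest = digest

    def on_fragment(self, index: int, payload: bytes) -> None:
        self.slots[index] = payload

    def missing(self):
        return [i for i, s in enumerate(self.slots) if s is None]

    def assemble(self) -> bytes:
        assert not self.missing(), "fragments still missing"
        image = b"".join(self.slots)
        assert hashlib.sha256(image).digest() == self.digest, "corrupt image"
        return image

image = bytes(1000)  # stand-in firmware image
frags = fragment(image)
session = FuotaSession(len(frags), hashlib.sha256(image).digest())
for i, f in enumerate(frags):
    session.on_fragment(i, f)
print(len(session.assemble()), "bytes verified and ready to apply")
```

The power budget concern mentioned above follows directly from this structure: every fragment is radio receive time, so fragment size, data rate, and loss recovery strategy dominate the energy cost of an update.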
LoRaWAN enables developments in direct-to-satellite IoT, making remote connections viable for all
We have an abundance of solutions for wireless connectivity in populous regions: the smart cities and buildings. But huge swathes of the planet are left unconnected, even though they host massive-scale industrial processes such as logistics, agriculture, utilities, and distributed infrastructure. These industries crave visibility of their processes, yet have been left neglected and unconnected because legacy systems were uneconomic.
Innovations in low-power satellite IoT offer Systems Integrators viable new technologies to address these challenges using the LoRaWAN protocol.
4:00 PM - 5:30 PM
Session 12
Localisation (UWB, Bluetooth)
Evaluation of Ultra-Wideband and Bluetooth Technologies for Low-Power and Precise Localization
Recent years have witnessed a surge of interest in wireless ranging and localization complementary to GNSS, used to navigate or geo-reference objects in areas without satellite coverage, to provide higher precision or reactivity, or to avoid the cost, size, or power consumption of additional chips. Apple and Samsung smartphones have paved the way by including Ultra-Wideband (UWB) solutions, demonstrating the commercial promise of applications such as indoor real-time localization systems (RTLS), asset tracking, smart locks, and keyless car entry.
Several ranging methods are available for use in wireless systems based on received signal strength (RSSI), time of flight (ToF) and channel sounding (CS). Other localization techniques such as time difference of arrival (TDoA) or angle of arrival (AoA) typically exploit multiple antenna configurations. These are supported by a wide range of wireless technologies, targeting different applications, and suited for use with different ranging and localization techniques.
This work evaluates localization techniques based on two wireless standards intended for battery-operated devices and short to medium range: Bluetooth and Bluetooth Low Energy (LE), and UWB (IEEE 802.15.4z), comparing them in terms of power consumption, accuracy, latency, and cost. The connectivity champions Bluetooth and Bluetooth LE are now being extended with more capable ranging and localization features, competing with UWB. UWB supports higher data rates and is optimized for precise and secure ranging, at the typical cost of an order of magnitude higher peak power consumption, larger silicon area, and lower link budget. Nevertheless, the two-way ToF employed in UWB devices provides a time- and energy-efficient means to determine distance with cm-level precision. The CS technique employed by Bluetooth requires a long time to sample the different frequencies and convert them to a time-domain response, making it slower and more energy-hungry. In addition, the available bandwidth of 80 MHz in the ISM band limits the achievable resolution to more than 1 m. However, with the planned allocation of the new 6 GHz band to Bluetooth, its performance is expected to approach that of UWB, albeit at an additional cost in power consumption and complexity.
Depending on the application requirements, either Bluetooth or UWB will have the edge. UWB is the technology of choice when precision and efficiency are paramount, while Bluetooth can be seen as a smaller and cheaper alternative that offers connectivity to many devices. Its ranging speed and support for TDoA make UWB better suited for scaling, allowing many devices to be localized quickly. The two technologies can also be seen as complementary and merged to combine Bluetooth's connectivity with UWB's localization capabilities, expanding on the concept of narrowband-assisted UWB localization and paving the way for future applications.
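The resolution figures quoted above follow from the rule of thumb that signal bandwidth limits range resolution, roughly ΔR ≈ c/(2B); the 500 MHz UWB channel width below is an assumed typical value for comparison. UWB's cm-level precision additionally relies on fine timestamping of the first arriving path, beyond this nominal figure.

```python
# Bandwidth-limited range resolution, dR ≈ c / (2B). The 500 MHz UWB
# channel width is an assumed typical value for comparison.
C = 299_792_458.0  # speed of light, m/s

for name, bw_hz in (("Bluetooth CS, 80 MHz ISM band", 80e6),
                    ("UWB channel, 500 MHz", 500e6)):
    print(f"{name}: ~{C / (2 * bw_hz):.2f} m nominal range resolution")
```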
Ready for the next wave of UWB
Over the last few years, UWB based on the IEEE 802.15.4z standard has been widely adopted for secure ranging and proximity applications.
In the meantime, IEEE is working on the specification for the next generation, making UWB ready for many more applications beyond ranging.
The new IEEE 802.15.4ab standard focuses on advancements in ranging capabilities, enhanced modulation schemes, standardized sensing, low-latency communication, and more.
In this session we will provide a sneak preview on the new features and discuss the impact on testing.
Ultra-Wideband based Indoor Positioning for Smartphones
Complex buildings such as airports, train stations, and trade fairs are challenging areas where people struggle with orientation and finding their desired destinations. Often, these orientation problems lead to bad experiences, such as when a family member or even a child cannot be found, or when a connecting flight is missed because time was lost finding the right gate.
Fortunately, indoor navigation systems similar to the well-known and widely used GPS outdoors are now available. However, these systems require a high level of expertise to plan and install, and they struggle to scale when large numbers of users are involved.
To address this, Pinpoint developed an Ultra-Wideband (UWB) based, easy-to-install positioning system that hides all the complexity of positioning technology. This allows system integrators and device manufacturers without expert knowledge to design new products with location awareness.
The system uses a decentrally organized mesh network that employs UWB as the radio technology. Within this network, each building satellite transmits UWB beacons and can wirelessly synchronize to 1-nanosecond accuracy using a patented algorithm. UWB-enabled smartphones and positioning modules can receive these UWB beacons to synchronize themselves to the network time and calculate their own position with an accuracy of 30 cm. With this technology, the aforementioned use cases can be realized by integrating the building satellite modules into existing building devices, such as light bulbs or fire detectors, and the positioning modules into objects, such as Automated Guided Vehicles (AGVs), employee badges, or simply by using UWB-enabled smartphones.
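The quoted figures are self-consistent: light travels about 0.3 m per nanosecond, so in a downlink TDoA scheme the network's 1 ns synchronization error maps almost directly into the stated 30 cm position accuracy.

```python
# Why 1 ns synchronization matters: radio waves travel ~0.3 m per
# nanosecond, so in a downlink TDoA scheme the network's timing error
# maps almost directly into position error.
C = 299_792_458.0  # speed of light, m/s

sync_error_ns = 1.0
print(f"{C * sync_error_ns * 1e-9:.2f} m position uncertainty "
      f"from {sync_error_ns:.0f} ns of synchronization error")
```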