

xDR Challenge 2023
~PDR and BLE with LiDAR based GT~

xDR Challenge is a series of indoor localization competitions organized by the PDR Benchmark Standardization Committee, mainly focused on integrated indoor localization methods based on dead reckoning. xDR Challenge 2023 will be held as an official competition track of IPIN 2023 (Track 5). Our track is categorized as an "off-site, off-line" track, which means that we provide datasets for indoor localization to competitors, and the competitors are asked to submit estimated trajectories of the targets within a certain time limit. The target field of this year's competition is a commercial facility, and the targets of localization are pedestrians walking in the field. We deployed about 100 BLE beacons in the field for correcting the error of PDR, and we adopted LiDAR for collecting the ground-truth positions of the targets. We will evaluate the estimated trajectories with multi-faceted evaluation metrics against the precise LiDAR-based ground-truth data.
URL: https://unit.aist.go.jp/harc/xDR-Challenge-2023/index.html

Definition of data and trials used in this track


Training data, which are provided as off-line files and are used to calibrate the system (provided at Dataset section).

Testing trials, which are provided through the EvaalAPI and are reloadable. Competitors can run them as many times as they like to evaluate and fine-tune their system as well as to get used to the EvaalAPI.

Scoring trials, which are provided through the EvaalAPI and are only available at specific times during the competition production. Those are used to evaluate the competitors. Some of the scoring trials include data from a subject not in the training data.

* These definitions are based on those of Track 3 of the IPIN Competition, with slight modifications for our track.

Important Dates

Technical annex published: April (published on April 29th)
Pre-admission: started in May (accepting applications now)
Training data published: June (started providing the training data)
Evaluation tools published: August
Official application: by August 31
Scoring trials: the middle of September, 2023
Winner proclamation: 25-28 September, 2023 (during the IPIN Conference)

News

  • (Sept. 8th, 2023) Evaluation details are added. Evaluation tools are shared as scripts on GitHub
  • (Aug. 29th, 2023) FAQs about the scoring trials are added.
  • (Aug. 28th, 2023) E-mail address for submitting the technical description is added. (Step 3 of "How to Participate")
  • (Aug. 25th, 2023) Announcements for the official application and registration are added (in "Important Dates" and "How to Participate")
  • (July 21st, 2023) Unveiled the prize for the winner
  • (July 12th, 2023) Included the gyro's drift information from Android in the training data. Also, examples of PDR and integrated localization have been added.
  • (July 3rd, 2023) Included geomagnetic information of the facility in the readme of the training data
  • (June 26th, 2023) Started providing the entire training data to those who completed pre-registration
  • (June 6th, 2023) Started providing the first sample data to those who completed pre-registration
  • (May 26th, 2023) Started accepting pre-admission via our pre-registration form

Overview

xDR Challenge 2023 ~PDR and BLE with LiDAR based GT~ (hereafter referred to as xDR Challenge 2023) will be held as a sequel to the PDR/xDR Challenge series hosted by the PDR Benchmark Standardization Committee. As in previous years, xDR Challenge 2023 will be held as an official competition track of the IPIN conference. There are three categories of tracks in the IPIN 2023 competition; our track is categorized as an "off-site, off-line" track.

The off-site competition means that:
- the competition organizers have conducted the measurements and prepared the dataset for the competition,
- the measured dataset is provided to the competitors, and
- the competitors are required to estimate the targets' movements by applying their own algorithms and to submit the estimated results.
The off-line competition means that:
- the competitors are not required to submit the results in real time, but within a certain (longer) time limit, and
- the competitors can obtain the whole data of each sequence and may apply global optimization constrained by the provided information.

The dataset of our track consists of the sensor data required for PDR-based indoor localization. The data were measured in a commercial facility at a highway rest area. We measured pedestrian movements using Android devices, collecting sensor data (gyroscope, accelerometer, magnetic sensor) as well as the BLE signals from the BLE beacons.

Similar to the previous PDR/xDR Challenges, submitted trajectories will be evaluated with multi-faceted evaluation metrics. Notably, we collected a highly precise ground truth of the targets' movements using a hand-held LiDAR. The detailed evaluation metrics will be announced at a later date.

xDR Challenge 2023 will be conducted closely together with the other tracks of the IPIN competition under a common schedule. We also adopt a common tool named EvAAL API for sharing the dataset and receiving results. As with the other tracks, the real competition will be conducted on a day in the middle of September 2023. We will provide training data and testing trials in the same format as the real dataset, allowing competitors to prepare and tune their localization algorithms/systems before the real competition.

We look forward to your participation!

Competition Details

Target Environments: Commercial facility

This year's target environment is a commercial facility: an expressway service area. It has two buildings; one is two-story and the other is one-story. We used MyBeacon (Aplix) units as BLE beacons. BLE signals are emitted every 0.1 sec. The beacon locations will be provided as (x, y, z) coordinates on the floor map.

Measuring Devices and Data Distribution

The 9-axis IMU sensor data used for the competition are measured by an AQUOS Sense 6 (SHARP). The ground-truth location was measured with a hand-held LiDAR (GeoSLAM ZEB-Horizon) at approximately 100 Hz. The ground truth will be disclosed only for the training data. The data will be distributed over the EvAAL API. An overview of the provided data is shown in Table 1.

Table 1. Overview of the data

Data type | Measuring device | Rate | Available in training data | Available in scoring trials
Acceleration | AQUOS Sense 6 | Approx. 100 Hz | yes | yes
Angular velocity | AQUOS Sense 6 | Approx. 100 Hz | yes | yes
Magnetism | AQUOS Sense 6 | Approx. 100 Hz | yes | yes
BLE RSSI | AQUOS Sense 6 | Emitted from beacons at 10 Hz; recorded when received by AQUOS Sense 6 | yes | yes
Ground truth location (x, y, z) | ZEB-Horizon | Approx. 100 Hz | yes | only start and end
Ground truth orientation (quaternion) | ZEB-Horizon | Approx. 100 Hz | yes | only start and end
Ground truth floor name | - | 1 floor name for each path | yes | yes

Requirements for competitors

The competitors are required to estimate the movements (trajectories) of the subjects walking in the facility using their own indoor localization algorithms. We believe that a PDR-based indoor localization algorithm is suitable for tracking the subjects targeted in this competition. We provide the sensor data required for PDR-based indoor localization as the dataset of the off-site competition. We asked the subjects to carry an Android device to collect sensor data (gyroscope, accelerometer, magnetic sensor) as well as the BLE signals from the BLE beacons. The competitors are required to submit the results of indoor localization (time-stamped x, y coordinates). We will evaluate the submitted results with our multi-faceted evaluation metrics. In order to evaluate the performance of PDR alone (EAG, etc.), we ask competitors to submit estimates in two phases as follows:
Scoring trials without BLE RSSI
First, we will provide data without BLE reception information to calculate the evaluation indexes for relative positioning. Only the positions and orientations of the starting and ending points will be provided as correct values (we provide this information through the EvAAL API). Competitors submit their results using this data within a specified term; the EvAAL API server closes after the term ends.
Scoring trials with BLE RSSI
Second, we will provide data that includes BLE reception information to calculate the absolute-positioning evaluation indexes; this is the same dataset as in the first session except that it includes BLE reception information. Again, competitors submit their results within a specified term; the EvAAL API server closes after the term ends.
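The two phases above both go through the EvAAL API. As a rough sketch only: the actual base URL, endpoint paths, trial names, and submission format are defined by the EvaalAPI documentation, and the names used here (`BASE_URL`, the `/estimates` path, and the timestamp,x,y CSV field order) are assumptions for illustration, not the official interface.

```python
from urllib import request

# Hypothetical base URL -- check the competition instructions for the real one.
BASE_URL = "http://evaal.example.org/evaalapi"


def format_estimate(timestamp, x, y):
    """Format one position estimate as a CSV line (assumed field order)."""
    return f"{timestamp:.3f},{x:.3f},{y:.3f}"


def submit_estimates(trial, estimates):
    """POST all estimates at once to a (hypothetical) estimates endpoint.

    estimates: iterable of (timestamp, x, y) tuples.
    """
    body = "\n".join(format_estimate(t, x, y) for t, x, y in estimates)
    req = request.Request(f"{BASE_URL}/{trial}/estimates",
                          data=body.encode("ascii"), method="POST")
    with request.urlopen(req) as resp:   # requires a live server
        return resp.status
```

Since scoring trials must be uploaded all at once within the time limit, batching the whole trajectory into a single request as above matches the "limited time to upload all estimates at once" workflow described later.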

Evaluation Framework

The PDR Benchmark Standardization Committee (PDRBMSC) discusses and proposes a standard evaluation framework for indoor localization methods and systems, especially for algorithms based on xDR. The evaluation framework applied in this competition was determined through discussion in the committee. The final evaluation metrics and integrated index are calculated from indicators and negative checks. The evaluation scripts used for the competition are now available in PDRBMSC's GitHub repository. A brief introduction of this year's evaluation indicators and negative checks follows:


Table 2. Overview of evaluation indexes

Name of Index | Corresponding indicator | Description
I_ce | CE (Circular Error) | Checks the absolute positional error between the trajectory and the ground truth at check points.
I_ca | CA_l (Circular Accuracy in the local space) | Checks the deviation of the error distribution in the local x-y coordinate system.
I_eag | EAG (Error Accumulation Gradient) | Checks the speed of error accumulation from the correction points.
I_ve | VE (Velocity Error) | Checks the velocity error against the ground-truth velocity. I_ve is evaluated by averaging the velocity errors in a 1 sec. time window (+/- 0.5 sec from each evaluation point) compared with GT. This averaging is intended to smooth peaky high-frequency fluctuations of the velocity.
I_obstacle | Requirement for Obstacle Avoidance | Checks the percentage of trajectory points within the walkable area.
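The 1-second-window averaging described for I_ve can be sketched as follows. This is a minimal illustration assuming the estimated and ground-truth speeds are already sampled on a common time base; the official evaluation scripts on GitHub are authoritative.

```python
import numpy as np


def velocity_error_smoothed(t, v_est, v_gt, eval_points, half_window=0.5):
    """Average |v_est - v_gt| within +/- half_window seconds of each
    evaluation point, smoothing peaky high-frequency velocity fluctuations.

    t: (N,) sample timestamps in seconds (common to both velocity series)
    v_est, v_gt: (N,) estimated and ground-truth speeds
    eval_points: timestamps at which the smoothed error is evaluated
    """
    t = np.asarray(t, dtype=float)
    err = np.abs(np.asarray(v_est, dtype=float) - np.asarray(v_gt, dtype=float))
    out = []
    for tp in eval_points:
        mask = (t >= tp - half_window) & (t <= tp + half_window)
        out.append(err[mask].mean())  # mean error inside the 1-sec window
    return np.array(out)
```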

Formulas for the evaluation indexes and the integrated index for determining the winner
The winner of the competition is determined by the integrated index I_i, which is calculated as a weighted sum of the elemental indexes. The formulas for the elemental and integrated indexes are shown in Table 3. Each index ranges from 0 to 100 and varies linearly between the max-index and min-index conditions. The weights for calculating I_i are shown in Table 4.

Table 3. Formulas for elemental and integrated indexes

Name of Index | Condition of Max. Index | Condition of Min. Index | Formula
I_ce | ce < 1.0 | 30 < ce | 100 - (100 * (ce - 1.0)) / 29
I_ca | ca = 0.0 | 10 < ca | 100 - (10 * ca)
I_eag | eag < 0.05 | 2.0 < eag | 100 - (100 * (eag - 0.05)) / 1.95
I_ve | ve < 0.1 | 2.0 < ve | 100 - (100 * (ve - 0.1)) / 1.9
I_obstacle | obs = 1.0 | obs = 0.0 | 100 * obs
I_i | - | - | W_ce * I_ce + W_ca * I_ca + W_eag * I_eag + W_ve * I_ve + W_obstacle * I_obstacle

Table 4. Weights for elemental indexes

Name of Weight | Corresponding Index | Value of Weight
W_ce | I_ce | 0.25
W_ca | I_ca | 0.20
W_eag | I_eag | 0.25
W_ve | I_ve | 0.15
W_obstacle | I_obstacle | 0.15
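Putting Tables 3 and 4 together, the integrated index can be sketched as below. One assumption on our part: each elemental index is clipped to [0, 100] outside the max/min conditions, which is how we read the linear-interpolation description; the official scripts are authoritative.

```python
import numpy as np

# Weights from Table 4 (they sum to 1.0).
WEIGHTS = {"ce": 0.25, "ca": 0.20, "eag": 0.25, "ve": 0.15, "obstacle": 0.15}


def index_ce(ce):
    """I_ce: 100 when ce < 1.0 m, 0 when ce > 30 m, linear in between."""
    return float(np.clip(100 - 100 * (ce - 1.0) / 29, 0, 100))


def index_ca(ca):
    """I_ca: 100 when ca = 0.0, 0 when ca > 10, linear in between."""
    return float(np.clip(100 - 10 * ca, 0, 100))


def index_eag(eag):
    """I_eag: 100 when eag < 0.05, 0 when eag > 2.0, linear in between."""
    return float(np.clip(100 - 100 * (eag - 0.05) / 1.95, 0, 100))


def index_ve(ve):
    """I_ve: 100 when ve < 0.1, 0 when ve > 2.0, linear in between."""
    return float(np.clip(100 - 100 * (ve - 0.1) / 1.9, 0, 100))


def index_obstacle(obs):
    """I_obstacle: 100 * fraction of trajectory points in the walkable area."""
    return 100 * obs


def integrated_index(ce, ca, eag, ve, obs):
    """Weighted sum I_i of the elemental indexes (Tables 3 and 4)."""
    return (WEIGHTS["ce"] * index_ce(ce)
            + WEIGHTS["ca"] * index_ca(ca)
            + WEIGHTS["eag"] * index_eag(eag)
            + WEIGHTS["ve"] * index_ve(ve)
            + WEIGHTS["obstacle"] * index_obstacle(obs))
```

For example, a trajectory meeting every max-index condition scores I_i = 100, and one failing every min-index condition scores 0.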

Frequency of the evaluation
Note that the frequency of the evaluation depends on the frequency of the ground-truth data, which for xDR Challenge 2023 is about 100 Hz. If the sampling frequency of your estimation is less than 100 Hz, your estimation cannot be accurately evaluated. We recommend estimating trajectories at 100 Hz or up-sampling them to 100 Hz.
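One simple way to up-sample an estimate to the ground-truth rate is linear interpolation. A minimal sketch, assuming the estimate is given as timestamps plus (x, y) positions:

```python
import numpy as np


def upsample_trajectory(t, xy, rate=100.0):
    """Linearly up-sample an estimated trajectory to ~100 Hz.

    t: (N,) timestamps in seconds, strictly increasing
    xy: (N, 2) estimated (x, y) positions
    Returns new timestamps and the interpolated (M, 2) positions.
    """
    t = np.asarray(t, dtype=float)
    xy = np.asarray(xy, dtype=float)
    t_new = np.arange(t[0], t[-1], 1.0 / rate)      # uniform 100 Hz grid
    x_new = np.interp(t_new, t, xy[:, 0])           # interpolate x and y
    y_new = np.interp(t_new, t, xy[:, 1])           # independently
    return t_new, np.column_stack([x_new, y_new])
```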

In addition to our evaluation indicators, CE75 will be provided as a reference for consistency with the other IPIN competition tracks. Note that CE75 is NOT used for determining the winners. The evaluation framework and the scripts for calculating the winner-determining scores are provided from our GitHub account. The evaluation scripts can calculate CEs at arbitrary percentiles.
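Assuming CE is the 2-D Euclidean distance between estimate and ground truth at each evaluation point (the usual definition; the official scripts are authoritative), an arbitrary-percentile CE such as CE75 can be sketched as:

```python
import numpy as np


def circular_errors(est_xy, gt_xy):
    """2-D Euclidean errors between estimated and ground-truth positions."""
    return np.linalg.norm(np.asarray(est_xy) - np.asarray(gt_xy), axis=1)


def ce_percentile(est_xy, gt_xy, q=75):
    """CE at an arbitrary percentile; q=75 gives the IPIN-style CE75."""
    return float(np.percentile(circular_errors(est_xy, gt_xy), q))
```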

How to Participate

Step1 Request for Admission

If you are interested in our competition, please register for pre-admission using our pre-registration form. We will provide you with the sample data after we confirm your registration.

Step2 Downloading training data and try testing trials

We will provide an ID and password for our training data to those who completed pre-admission. You can download the dataset with the provided ID and password. After downloading the sample data, competitors can start development. We will also provide opportunities to run testing trials through the EvAAL API with servers hosted by the competition organizers.

Step3 Application of the Competition (Deadline Aug. 31, 2023)

If you decide to join the competition, please register for final admission via EvAAL's official application form. Competitors are required to provide short and long descriptions of their system. Please provide the required information in the application form and send a short (2-4 pages) technical description of your localization system by e-mail to the Track 5 organizers' mailing list (M-xDR-Challenge2023-ml@aist.go.jp).

Step4 Registration of the conference and payment of registration fee

In addition to the application, competitors are required to complete conference registration and pay the registration fee on the registration page. For competitors who only join the off-site competition track, there is a designated registration type.

Step5 Scoring trials

On a track-specific day during the second week of September, competitors are required to join the scoring trials. We will provide data without ground-truth references for the scoring trials through the EvAAL API. Competitors will be asked to download the sensor data all at once, and will have a limited time to upload all estimates at once. We will hold the following two scoring trials one after another. See the requirements for competitors for more detail.

Step 5a Scoring trials without BLE RSSI
Step 5b Scoring trials with BLE RSSI
Step6 Announcement of the results

The results of xDR Challenge 2023 will be announced during the IPIN 2023 conference.

Dataset

Training data (last update : 2023 July 12th)

ID and password are provided after pre-registration.

README
README (Japanese)
Figures of README
Data

FAQs

Frequently Asked Question about the Scoring Trials

Q. Who are the subjects of the scoring trials?
A. The subject numbers for the scoring trials will not be disclosed before the competition. The scoring trials contain both subjects included in the training data and subjects not included.
Q. Which terminal was used for measuring the dataset of the scoring trials?
A. The scoring trials include only terminal number 51.
Q. Can we get calibration data in which the subjects stand still?
A. We did not intentionally measure such calibration data. At the start and end of the data, there might be periods with no activity. These periods might be included in the scoring trials, but this is not guaranteed.

Prize

Thanks to our sponsors, the winner will be awarded 1000€. (NEW)

Reference

  • Takuto Yoshida, Katsuhiko Kaji, Satoki Ogiso, Ryosuke Ichikari, Hideaki Uchiyama, Takeshi Kurata, and Nobuo Kawaguchi: A Survey of Ground Truth Measurement Systems for Indoor Positioning, Journal of Information Processing, 2023.
  • Francesco Potortì et al.: Off-line Evaluation of Indoor Positioning Systems in Different Scenarios: The Experiences from IPIN 2020 Competition, IEEE Sensors Journal, 2021.
  • Takeshi Kurata, Takashi Maehata, Hidehito Hashimoto, Naohiro Tada, Ryosuke Ichikari, Hideki Aso, and Yoshinori Ito: IoH Technologies into Indoor Manufacturing Sites, Proceedings of Advances in Production Management Systems Conference, 2019.
  • Ryosuke Ichikari, Katsuhiko Kaji, Ryo Shimomura, Masakatsu Kourogi, Takashi Okuma, and Takeshi Kurata: Off-Site Indoor Localization Competitions Based on Measured Data in a Warehouse, Sensors, vol. 19, issue 4, article 763, 2019.
  • Takeshi Kurata, Ryosuke Ichikari, Ryo Shimomura, Katsuhiko Kaji, Takashi Okuma, and Masakatsu Kourogi: Making Pier Data Broader and Deeper: PDR Challenge and Virtual Mapping Party, MobiCASE 2018 (9th EAI International Conference on Mobile Computing, Applications and Services), 2018.
  • Masakatsu Kourogi and Tomohiro Fukuhara: Case studies of IPIN services in Japan: Advanced trials and implementations in service and manufacturing fields in special session "Value Creation in LBS (Location-Based Services)", IPIN 2017.

Important Links

Organizing Committee

  • General Co-Chair: Ryosuke Ichikari, Ph.D., AIST, Japan
  • General Co-Chair: Satoki Ogiso, Ph.D., AIST, Japan
  • General Co-Chair: Akihiro Sato, Ph.D., AIST, Japan
  • International Liaison Co-Chair: Antonio Ramon Jimenez Ruiz, Ph.D., CSIC-UPM, Spain
  • International Liaison Co-Chair: Soyeon Lee, Ph.D., ETRI, Republic of Korea
  • Industrial Liaison Chair: Takeshi Kurata, Ph.D., AIST & Univ. of Tsukuba, Japan

Contact

e-mail: M-xDR-Challenge2023-ml@aist.go.jp

Sponsors

Supporting Communities

  • PDR-Benchmark LOGO
  • HASC LOGO
  • HMHS LOGO
  • IPNJT LOGO

xDR Challenge series