State-of-the-art workstation for large foundation model research,
Intel Core Ultra 7, 20 cores, 5.3GHz,
128 GB DDR5-5600 RAM,
NVIDIA RTX PRO 6000, 96GB VRAM,
4TB internal NVMe SSD.
State-of-the-art Edge-AI development kits,
2048-core NVIDIA Ampere architecture GPU with 64 Tensor Cores,
12-core Arm® Cortex®-A78AE v8.2 64-bit CPU,
2x NVDLA v2.0,
PVA v2.0,
64GB 256-bit LPDDR5,
64GB eMMC 5.1.
Memory interface: LPDDR4 via DMA
CPU: 32-bit Arm Cortex-M4
NPU: 20 x Akida 1.0 neuron-mesh nodes
Peak INT8 performance: 1.5 TOPS
On-chip memory: 8MB high-speed near compute SRAM
Clock frequency: 300MHz
Operating temperature: 0 – 70°C
Thermal solution: no fan or heatsink required
Typical application power: 1W
World’s smallest and most power-efficient event-based vision sensor.
The sensor module connects directly to a Raspberry Pi 5 via the camera connector’s MIPI CSI-2 (D-PHY) interface.
320×320 array of 6.3μm contrast detection pixels
Ultra-high-speed event data output (equivalent to >10 kfps time resolution) with 1 μs-precision time stamping
Low latency <150 μs at 1,000 lux, <1,000 μs at 5 lux
Native high dynamic range >140 dB
Low power consumption <50 mW
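Unlike frame cameras, an event-based sensor such as the one above emits a stream of per-pixel events rather than images. As a minimal sketch, the snippet below accumulates a window of events into a 320x320 frame; the (x, y, polarity, timestamp_us) tuple layout is an illustrative assumption, not the sensor's actual driver API.

```python
# Minimal sketch: accumulating event-camera output into a frame.
# The (x, y, polarity, timestamp_us) event layout is assumed for
# illustration; real sensor SDKs define their own event format.

WIDTH = HEIGHT = 320  # pixel array size from the spec above

def accumulate(events, window_us):
    """Sum event polarities per pixel over a fixed time window."""
    frame = [[0] * WIDTH for _ in range(HEIGHT)]
    if not events:
        return frame
    t0 = events[0][3]
    for x, y, polarity, t_us in events:
        if t_us - t0 >= window_us:
            break  # timestamps carry 1 us precision, per the spec
        frame[y][x] += 1 if polarity else -1
    return frame

# Example: three events inside a 1 ms window, one outside it
events = [(10, 20, 1, 0), (10, 20, 1, 5), (10, 20, 0, 9), (10, 20, 1, 2000)]
frame = accumulate(events, window_us=1000)
print(frame[20][10])  # 1 + 1 - 1 = 1 (the last event falls outside the window)
```

The >10 kfps equivalent time resolution means such windows can be made far shorter than any conventional camera exposure.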
State-of-the-art biosensing platform.
32 channels,
24-bit high-fidelity biosensors (EEG, EOG, EMG, and other physiological signals)
Medium EEG cap.
Open-source biosensing platform.
4/8 channels,
24-bit high-fidelity biosensors (EEG, EOG, EMG, and other physiological signals)
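The 24-bit samples these platforms stream are raw two's-complement ADC counts that must be scaled to physical units before analysis. The sketch below shows one common conversion; the 4.5 V reference and 24x gain are assumed typical values for ADS1299-based boards such as the Cyton, so check the board documentation before relying on them.

```python
# Minimal sketch: converting a raw 24-bit biosensor sample to microvolts.
# V_REF and GAIN are assumptions (typical for ADS1299-based boards),
# not values taken from any specific board's documentation.

V_REF = 4.5   # assumed ADC reference voltage, in volts
GAIN = 24     # assumed programmable amplifier gain

def raw24_to_uV(sample: bytes) -> float:
    """Decode a 3-byte two's-complement sample into microvolts."""
    value = int.from_bytes(sample, byteorder="big", signed=True)
    scale = V_REF / GAIN / (2**23 - 1)   # volts per ADC count
    return value * scale * 1e6           # convert volts to microvolts

print(raw24_to_uV(b"\x7f\xff\xff"))  # full-scale positive input, ~187500 uV
```

With these assumed settings the least significant bit corresponds to roughly 0.022 uV, which is what makes 24-bit front ends suitable for microvolt-level EEG signals.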
PicoScope 5000 Series oscilloscope and function generator (200 MHz analog bandwidth, up to 16-bit resolution and 1 GS/s, 20 MHz function generator).
J-Link debuggers (Ultra and EDU versions).
Lab power supply.
Multimeters.
Our electronic and mobile development station has a variety of:
Mobile phones (Android and iOS).
Arduinos (Uno, Mega, Nano BLE Sense, etc.)
Raspberry Pis.
Software-defined radio platforms (USRP B200).
Blood pressure and EKG monitors.
Biopotential sensing platforms (OpenBCI Cyton and Ganglion).
Bioimpedance sensing platforms (MAX30009EVAL).
Robotic car.
And various development kits from STMicroelectronics, Nordic Semiconductor, Microchip, etc.
We have a dedicated station for rapid prototyping that includes:
Fully equipped soldering and rework stations,
Fume extractors,
Hot plate,
And various tools and materials (e.g., flux, desoldering supplies, IPA, conductive threads).
The Smart Home Lab (SHL) is a key research facility designed, developed and managed by the IoT group. It is a physical space (currently located in 5.26 Abacws) comprising over 170 networked smart home devices, with six hot-desk working areas where researchers (staff and students) can conduct various types of research with smart home devices. The lab houses a wide range of devices, such as smart TVs, environmental monitors, smart kettles, various smart speakers, robotic vacuum cleaners, smart meters, doorbells, door locks and many more. We use openHAB and Home Assistant to capture semantic data, and Wireshark to capture network traffic and related network communications and behaviours. The lab also includes a video network connecting six different types of smart home and consumer CCTV cameras to a network video recorder, supporting video-analysis research in the smart home domain. We also have a dedicated data science workstation to support extensive machine learning and deep learning research using smart home data.
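A common first step with the lab's Wireshark captures is aggregating traffic per device. As a minimal sketch, the snippet below totals frame bytes by source MAC address; the (timestamp, source MAC, frame length) record layout is an illustrative assumption for a capture exported as fields (e.g. via tshark), not Wireshark's native file format.

```python
# Minimal sketch: per-device traffic volume from an exported packet log.
# The (timestamp, source_mac, frame_length) record layout is an
# assumption for illustration; real captures carry many more fields.

from collections import defaultdict

def bytes_per_device(packets):
    """Total frame bytes sent by each source MAC address."""
    totals = defaultdict(int)
    for _ts, src_mac, length in packets:
        totals[src_mac] += length
    return dict(totals)

log = [
    (0.00, "aa:bb:cc:00:00:01", 120),   # e.g. a smart speaker
    (0.05, "aa:bb:cc:00:00:02", 1500),  # e.g. a CCTV camera
    (0.09, "aa:bb:cc:00:00:01", 60),
]
print(bytes_per_device(log))
# {'aa:bb:cc:00:00:01': 180, 'aa:bb:cc:00:00:02': 1500}
```

Summaries like this pair naturally with the semantic events captured via openHAB and Home Assistant, since both streams can be aligned on timestamps.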
(1) 3 x Workstations with NVIDIA RTX A6000 GPUs (48GB VRAM)
(2) 3 x Workstations with NVIDIA RTX 4090 GPUs (24GB VRAM)
Details will be updated soon.
(3) RTX4070 workstation:
Core i7,
32GB RAM,
1TB SSD,
RTX 4070 Ti Super, 16GB VRAM.
Windows 11.
(4) SPIKE workstation (shared with the Cardiff IoT group):
Xeon W-2245 (8 cores, 16 threads, up to 4.7GHz),
128GB RAM,
256GB NVMe SSD,
RTX A6000, 48GB VRAM.
Ubuntu 20.04 LTS.
(1) 6 x Prusa MK4S 3D Printers.
(2) Soldering stations.
(3) Laser cutter.