Room: 43.2.411
Phone: +49 (0)731 50 26211
joschua.conrad(at)uni-ulm.de
M.Sc. Joschua Conrad
Joschua Conrad received his bachelor's degree from the Duale Hochschule Baden-Württemberg (DHBW) Stuttgart in 2016. During his studies and his work at Eisenmann SE in Böblingen and Stuttgart, he designed object-oriented software for real-time automation applications. During his master's studies he worked at the Institute of Microelectronics at Ulm University and, at Gigatronik GmbH in Ulm, developed a high-SNDR filter and a high-SNDR oscillator on printed circuit boards as well as software solutions for requirements engineering. He completed his master's thesis, "Design of a Ring Amplifier based Sigma Delta Modulator", at the Institute of Microelectronics in 2019.
He now works under the supervision of Prof. Dr.-Ing. Maurits Ortmanns in the field of machine learning.
Student Theses
[mt] = master's thesis, [rp] = bachelor's thesis
Current Theses
- Nour Elshahawy
Evaluation of Methods for Benchmarking and Re-Using SRAM Memory[mt]
Completed Theses
- Johannes Stark
Implementation of an In-Memory-Compute Circuit for the Inference of Neural Networks[mt]
- Kilian Storch
Evaluation of DRAM Links for Neural-Network Inference-Accelerators[mt]
- Simon Wilhelmstätter
Design and Implementation of the Dataflow for a Versatile Neural-Network Inference-System[mt]
- Simone Steinhauser
Investigation of the Data-Flow in a Neural-Network Inference System[rp]
- Rawan Hagag
Investigation and Design of Comparator Architectures for a SAR ADC in 28nm CMOS[rp]
- Luca Krüger
Analyzing the Influence of Neural-Network Hyperparameters on the Resilience over Mixed-Signal Hardware Errors[rp]
- Biyi Jiang
Modeling of Neural-Network Processing-Element Hardware on Algorithmic Level[mt]
- Paul Kässer
Development and Test of a Mixed-Signal Neural-Network Processing-Element[mt]
- Franjo Lovric
Evaluation of System-Level Structures for Neural-Network Accelerator Systems[mt]
Publications
2024
Differentiable Cost Model for Neural-Network Accelerator Regarding Memory Hierarchy
IEEE Transactions on Circuits and Systems I: Regular Papers (Early Access)
October 2024
DOI: 10.1109/TCSI.2024.3476534
Confidence Estimation and Boosting for Dynamic-Comparator Transient-Noise Analysis
22nd IEEE Interregional NEWCAS Conference (NEWCAS)
September 2024
DOI: 10.1109/NewCAS58973.2024.10666354
Enabling Power Side-Channel Attack Simulation on Mixed-Signal Neural Network Accelerators
IEEE International Conference on Omni-Layer Intelligent Systems (COINS), London, UK
July 2024
Stability Prediction of ΔΣ Modulators using Artificial Neural Networks
IEEE International Symposium on Circuits and Systems (ISCAS), Singapore
May 2024
DOI: 10.1109/ISCAS58744.2024.10557868
Too-Hot-to-Handle: Insights into Temperature and Noise Hyperparameters for Differentiable Neural-Architecture-Searches
6th IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), Abu Dhabi, UAE
April 2024
DOI: 10.1109/AICAS59952.2024.10595971
Attacking a Joint Protection Scheme for Deep Neural Network Hardware Accelerators and Models
6th IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), Abu Dhabi, UAE
April 2024
DOI: 10.1109/AICAS59952.2024.10595935
Multi-conditioned Graph Diffusion for Neural Architecture Search
Transactions on Machine Learning Research
March 2024
ISSN: 2835-8856
Web link: https://openreview.net/forum?id=5VotySkajV
2021
Nonlinearity Modeling for Mixed-Signal Inference Accelerators in Training Frameworks
28th IEEE International Conference on Electronics, Circuits, and Systems (ICECS), pp. 1-4
2021
DOI: 10.1109/ICECS53924.2021.9665503
2020
Design Approach for Ring Amplifiers
IEEE Transactions on Circuits and Systems I: Regular Papers
April 2020
DOI: 10.1109/TCSI.2020.2986553