From the first spark of inspiration to the final forward-looking horizon, this thesis unfolds as a journey to re-imagine the foundations of computation. We merge breakthroughs in materials science, electronic device engineering, and deep generative learning to confront three of modern computing's grandest challenges: the energy inefficiencies of classical architectures, the scaling limitations of neuromorphic hardware, and the exponential complexity of quantum systems.
We begin by identifying a threefold bottleneck at the heart of contemporary information processing: the von Neumann architecture separates memory from logic, incurring high energy and latency costs; neuromorphic hardware, which promises to collapse that separation, still struggles to scale; and quantum systems, with their exponentially expanding state spaces, defy conventional methods of characterization and control. Bridging these extremes demands both new materials and new paradigms: architectures that think and learn within memory itself and operate seamlessly across room-temperature and cryogenic domains. Our mission is to forge a unified computing framework that fuses neuromorphic principles, cryogenic and room-temperature memristors, and machine intelligence for quantum state tomography (QST).
The conceptual groundwork follows. Inspired by biological neurons, we explore how computation and memory can coexist within memristive architectures. Memristors, particularly resistive switching devices such as HfO$_2$-based ReRAM, emulate synaptic plasticity, enabling analog tuning and in-memory processing. We investigate both spiking and non-spiking neural models, contextualizing their use in QST. From this emerges the core idea of computation-in-memory (CiM): performing neural operations directly within dense memristor crossbars, bypassing the von Neumann bottleneck. This unifying concept becomes the architectural backbone of our hybrid classical–quantum platform.
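To make the CiM primitive concrete, the sketch below (Python/NumPy) models a single crossbar read: signed weights are mapped onto differential conductance pairs, and the matrix–vector product appears directly as summed column currents via Ohm's and Kirchhoff's laws. The conductance window and the linear weight-to-conductance mapping are illustrative assumptions, not parameters of the devices reported later in this thesis.

\begin{verbatim}
import numpy as np

def weights_to_conductances(W, g_min=1e-6, g_max=1e-4):
    # Assumed linear mapping of signed weights onto a differential
    # conductance pair (G+, G-); real devices would add quantization
    # and programming-noise models on top of this.
    scale = (g_max - g_min) / np.max(np.abs(W))
    g_pos = g_min + scale * np.clip(W, 0.0, None)
    g_neg = g_min + scale * np.clip(-W, 0.0, None)
    return g_pos, g_neg, scale

def crossbar_mvm(g_pos, g_neg, v_in):
    # One in-memory read: each output column sums conductance * voltage
    # (Ohm's law + Kirchhoff's current law); the differential pair
    # cancels the common g_min offset.
    return g_pos.T @ v_in - g_neg.T @ v_in

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))            # 8 input rows -> 4 output columns
v = rng.uniform(-0.2, 0.2, size=8)     # read voltages (V)

g_pos, g_neg, scale = weights_to_conductances(W)
i_out = crossbar_mvm(g_pos, g_neg, v)  # output currents (A)

# The analog currents are a scaled copy of the digital result W.T @ v.
print(np.allclose(i_out, scale * (W.T @ v)))   # True
\end{verbatim}

Because the differential pair cancels the shared minimum conductance, the summed currents reproduce the digital matrix–vector product up to a known scale factor, which is exactly the kernel that dominates the workload of the neural models used throughout this thesis.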
Our theoretical framework spans silicon physics, memristive mechanisms, and the formalism of quantum state reconstruction. We dissect electron-beam-induced processing (EBIP) as a route to room-temperature silicon device fabrication. We examine the physics of OxReRAM switching, ion migration, interfacial engineering, and energy barriers, and we extend this understanding to cryogenic regimes. In parallel, we articulate the formal structure of QST: density matrices, POVMs, and measurement data that scale as $4^N$ for $N$ qubits (a count made explicit below), where neural networks emerge as natural generative or inference engines. Variational autoencoders, especially spiking VAEs (SVAEs), form the probabilistic bridge between neuromorphic learning and quantum reconstruction.

The materials narrative begins with innovation in silicon processing. Abandoning high-temperature furnaces, we deploy spin-coated liquid polysilanes and transform them into functional amorphous silicon films via focused EBIP. STEM–EELS imaging, residual-gas analysis, and electrical characterization confirm uniform, low-defect films exhibiting stable ohmic behavior over months. This approach enables nanoscale precision and compatibility with flexible substrates, key requirements for next-generation neuromorphic hardware.
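The $4^N$ scaling invoked above follows from a simple parameter count. For an $N$-qubit state with density matrix $\rho$ probed by POVM elements $E_k$, the Born rule gives

\begin{equation}
p_k = \mathrm{Tr}\!\left(\rho\, E_k\right), \qquad \rho \in \mathbb{C}^{2^N \times 2^N}, \quad \rho \geq 0, \quad \mathrm{Tr}\,\rho = 1 ,
\end{equation}

so a Hermitian, unit-trace $\rho$ carries $4^N - 1$ independent real parameters, and an informationally complete reconstruction must pin down on the order of $4^N$ quantities, for instance the expectation values of the $4^N$ $N$-qubit Pauli strings. This exponential growth in measurement data is precisely what makes neural networks attractive as compact generative or inference engines.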
ReRAM devices, long hampered by high-voltage electroforming and poor uniformity, are re-engineered. By designing Pd/HfO$_2$ interfaces, we realize forming-free OxReRAM cells that switch below 2\,V, support multi-bit states, and retain data over $10^4$\,s. Atomic-scale analysis reveals a Pd–O–Hf interfacial layer that stabilizes low-bias conductive pathways. These devices combine high endurance with switching energies in the picojoule range, validating them as efficient synaptic elements for in-memory computing.

At cryogenic temperatures, the same memristive principles open a new frontier: Cryo-Memristors for spin-qubit control. Operating reliably at 4\,K, Pt/Ti/HfO$_2$-based memristors and their modified variants (M-PtHT) serve as low-noise, multi-bit programmable gain elements for scalable quantum control electronics. Embedded near the quantum layer, these devices synthesize analog bias voltages with sub-100\,µV resolution, drastically reducing the wiring complexity, heat load, and latency of large-scale qubit arrays. Statistical analysis shows linear resistance variation and stable multi-bit retention even at 4\,K, confirming their potential as cryogenic analog memory elements for autonomous qubit tuning and adaptive quantum feedback. This chapter bridges device physics and quantum hardware, demonstrating that memristive programmability extends beyond neuromorphic computing into the quantum domain.
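As a rough illustration of what sub-100\,µV bias resolution requires, the back-of-the-envelope sketch below (Python) treats a cryogenic memristor as a multi-bit programmable gain acting on a small reference voltage. The circuit topology, conductance window, bit depth, and reference values are assumptions chosen for illustration; they are not the measured parameters of the M-PtHT devices.

\begin{verbatim}
import numpy as np

# Assumed (hypothetical) values: a cryo-memristor programmed to one of
# 2**BITS conductance states sets a gain G / G_REF on a small reference
# voltage, synthesizing a DC bias for a spin-qubit gate electrode.
V_REF = 10e-3                  # reference voltage (V), assumed
G_REF = 50e-6                  # fixed reference conductance (S), assumed
G_MIN, G_MAX = 10e-6, 50e-6    # programmable window (S), assumed
BITS = 8                       # multi-bit depth, assumed

g_states = np.linspace(G_MIN, G_MAX, 2**BITS)  # evenly spaced states
v_bias = V_REF * g_states / G_REF              # resulting bias voltages

step_uV = np.diff(v_bias).max() * 1e6
print(f"bias range : {v_bias.min()*1e3:.2f} mV .. {v_bias.max()*1e3:.2f} mV")
print(f"LSB step   : {step_uV:.1f} uV (target: below 100 uV)")
\end{verbatim}

Under these assumed values, 256 conductance states span an 8\,mV tuning range with a step of roughly 31\,µV, comfortably below the 100\,µV target; the resolution actually achieved is of course set by the measured device statistics reported in this chapter.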
We confront QST through the lens of machine learning. A diverse suite of neural architectures (FCN, CNN, RNN, RBM, CGAN, and Transformer) is deployed to reconstruct quantum states from simulated measurement data. Among them, CNNs deliver the best trade-off between fidelity and computational time, especially under expectation-based measurements. Yet the SVAE architecture marks a turning point: as a generative probabilistic model, it achieves high-fidelity reconstructions even from sparse and noisy data, generalizes to higher qubit counts (up to 8), and scales sub-exponentially in runtime. Its latent-space encoding of high-dimensional quantum information makes it well suited to real-time, energy-efficient inference when implemented on memristive crossbars.

Simulations incorporating real device characteristics confirm that our forming-free and cryogenic OxReRAM-based CiM arrays can physically sustain deep QST networks. Memristor crossbars perform rapid in-memory matrix–vector multiplications, reducing inference energy by orders of magnitude compared to digital processors. Together, these results establish a scalable, hardware-aware path toward hybrid classical–neuromorphic–quantum computing.
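As a structural illustration of the generative approach, the sketch below (Python/PyTorch) shows a minimal non-spiking VAE over vectors of Pauli-string expectation values; the spiking variant replaces these dense layers with spiking neurons trained through surrogate gradients. The input format, layer widths, latent dimension, and loss weighting are illustrative assumptions, not the trained SVAE evaluated in this chapter.

\begin{verbatim}
import torch
import torch.nn as nn
import torch.nn.functional as F

class MeasurementVAE(nn.Module):
    # Encoder compresses a length-4**n expectation vector into a latent
    # code z; the decoder regenerates the full vector, from which rho
    # can be assembled linearly in the Pauli basis.
    def __init__(self, n_qubits=3, latent_dim=16, hidden=256):
        super().__init__()
        d = 4 ** n_qubits
        self.enc = nn.Sequential(nn.Linear(d, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)
        self.dec = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, d), nn.Tanh(),   # expectations lie in [-1, 1]
        )

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterize
        return self.dec(z), mu, logvar

def vae_loss(x_hat, x, mu, logvar, beta=1e-3):
    recon = F.mse_loss(x_hat, x)                                    # reconstruction
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # prior term
    return recon + beta * kld

# Usage on random stand-in data (real training would use simulated or
# measured expectation values, possibly sparse and noisy):
model = MeasurementVAE(n_qubits=3)
x = torch.empty(32, 4 ** 3).uniform_(-1, 1)
x_hat, mu, logvar = model(x)
loss = vae_loss(x_hat, x, mu, logvar)
loss.backward()
\end{verbatim}

The latent code plays the role described above: a compact probabilistic encoding of the measured state from which the full expectation vector, and hence $\rho$, is regenerated, the property the chapter exploits under sparse and noisy data.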
We conclude by reflecting on the broader implications. This work demonstrates that room-temperature EBIP enables sustainable silicon fabrication; that forming-free OxReRAM devices can be engineered for reliable analog switching; that Cryo-Memristors enable scalable, low-power qubit control; and that generative neural networks, especially SVAEs, offer a pathway to efficient, hardware-embedded quantum state reconstruction. Looking ahead, these innovations converge toward cryogenic integration with quantum processors, adaptive quantum feedback via spiking neuromorphic circuits, and the eventual realization of intelligent, energy-aware quantum systems.
Through every chapter, one theme resounds: the dissolution of boundaries between memory and logic, between classical and quantum, between matter and model. This thesis lays the foundation for a new kind of computing, one that learns like the brain, reasons like a physicist, and computes as the future demands.