1 

Estimating the extreme value index for imprecise data
In extreme value theory the focus is on the tails of a distribution. The main goal is to estimate the tail distribution of a rounded data set. To do so, the extreme value index must be estimated, but due to the rounding this estimate oscillates heavily, so a correct estimate cannot be obtained. By adding a small uniform random variable, the rounded data can be smoothed out and in this way the oscillation can be cancelled.
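The smoothing idea can be sketched with the standard Hill estimator of the extreme value index; the function names and the uniform-jitter width below are illustrative assumptions, not the thesis's actual code:

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of the extreme value index from the k largest observations."""
    x = sorted(data)
    top = x[-k:]                      # k largest order statistics
    threshold = x[-k - 1]             # (k+1)-th largest observation
    return sum(math.log(v / threshold) for v in top) / k

def smooth_rounded(data, precision, rng):
    """Undo rounding to the given precision by adding uniform noise of matching width."""
    return [v + rng.uniform(-precision / 2, precision / 2) for v in data]
```

On heavy-tailed (e.g. Pareto) samples that have been rounded, applying `smooth_rounded` before `hill_estimator` removes the ties introduced by rounding, which is what stabilizes the estimate.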

[PDF]
[Abstract]

2 

Investigation of Different Solvers for Radiotherapy Treatment Planning Problems
Radiotherapy treatment planning involves solving inequality-constrained minimization problems. The currently used interior point solver performs well, but is considered relatively slow. In this thesis we investigate two alternative solvers, based on the logarithmic barrier method and on Sequential Quadratic Programming (SQP), respectively. We argue that the behaviour of the logarithmic barrier solver is unpredictable, making it generally unreliable in this context. In addition, we substantiate that the SQP solver performs robustly, but is inefficient in computing the minimizers of its quadratic subproblems.
We conclude that, without serious improvements, neither of the investigated solvers is faster than the currently used interior point optimizer.
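As a minimal illustration of the logarithmic barrier idea (a toy sketch, not the planning solver itself), the snippet below minimizes x² subject to x ≥ 1 by damped Newton steps on the barrier function x² − μ·log(x − 1), shrinking μ between outer iterations; all names and parameters are illustrative:

```python
def log_barrier_newton(x, mu, iters=50):
    # Damped Newton on the barrier function phi(x) = x**2 - mu*log(x - 1)
    for _ in range(iters):
        g = 2.0 * x - mu / (x - 1.0)        # phi'(x)
        h = 2.0 + mu / (x - 1.0) ** 2       # phi''(x) > 0, so phi is convex
        step = g / h
        while x - step <= 1.0:              # damp the step to stay strictly feasible
            step /= 2.0
        x -= step
    return x

def barrier_method(x0=2.0, mu=1.0, shrink=0.5, outer=20):
    # Solve min x**2 subject to x >= 1; the true constrained minimizer is x = 1
    x = x0
    for _ in range(outer):
        x = log_barrier_newton(x, mu)       # warm-start each subproblem
        mu *= shrink
    return x
```

As μ shrinks, the barrier minimizer approaches the constrained optimum from the feasible side, which is the mechanism the solver under investigation relies on.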

[PDF]
[Abstract]

3 

Determining Performance Boundaries and Automatic Loop Optimization of High-Level System Specifications
Designers are confronted with high time-to-market pressure and an increasing demand for computational power. As a result, they are required to identify as early as possible the quality of a specification for an intended technology. The designer needs to know whether this specification can be improved, and at what cost. Specification trade-offs are often based on the experience and intuition of a designer, which in itself is not enough to make design decisions given the complexity of modern designs. Therefore, we need to identify the performance boundaries for the execution of a specification on an intended technology.
The degree of parallelism, required resources, scheduling constraints, and possible optimizations are essential in determining design trade-offs (e.g., power consumption and execution time). However, existing tools lack the capability to determine the relevant performance parameters and the option to automatically optimize high-level specifications, both of which are needed to make meaningful design trade-offs.
To address these problems, we present in this thesis a new profiler tool, cprof. The tool uses the Clang compiler front end to parse high-level specifications and to produce instrumented source code for profiling. From a high-level specification written in C or C++, it automatically determines the degree of parallelism of the source code. Furthermore, cprof estimates the number of clock cycles necessary to complete a program, automatically applies loop optimization techniques, determines lower and upper bounds on throughput capacity, and generates hardware execution traces. The tool assumes that the specification is executed on a parallel Model of Computation (MoC), referred to as a Polyhedral Process Network (PPN).
The proposed tool adds new functionality to existing technologies: the performance estimated by cprof for the PolyBench/C benchmarks is almost identical to that of realistic implementations on Field-Programmable Gate Array (FPGA) platforms. Cprof is capable of estimating lower and upper bounds on throughput capacity, making it possible for the designer to make performance trade-offs based on real design points. As a result, only the high-level specification is needed by cprof to assist in Design Space Exploration (DSE) and to improve design quality.

[PDF]
[Abstract]

4 

Optimal boundary point control for linear elliptic equations
This master thesis is concerned with optimal boundary point and function control problems for linear elliptic equations subject to control constraints. An elliptic partial differential equation with a Robin boundary condition is considered. In the point control problem, the control is chosen as a linear combination of Dirac delta functions. The weak formulation and optimality conditions are obtained for the function control problem. The main goal is to examine the existence of a weak solution and to derive optimality conditions for boundary point control. Introducing suitable discretization methods, namely finite volume and finite element methods, we obtain a finite-dimensional problem.
We apply efficient numerical methods, including the primal-dual active set strategy and the projected gradient and conjugate gradient methods. Test examples are presented to clarify the performance of these numerical methods. The conjugate gradient method is compared to the unconstrained MATLAB function QUADPROG, and the primal-dual active set strategy to the constrained QUADPROG. The projected conjugate gradient method is applied to improve on the projected gradient method. Due to their importance, the sparse point and function control problems are also studied; we apply the primal-dual active set strategy where the sparsity parameter f is chosen differently. All results are presented for both the point and function control problems.
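The projected gradient idea used here can be illustrated on a discretized problem: minimize a quadratic ½xᵀAx − bᵀx under box constraints by taking gradient steps and projecting back onto the box. This is a generic sketch, not the thesis code:

```python
def projected_gradient_qp(A, b, lo, hi, x0, step, iters=200):
    """Projected gradient for min 0.5*x'Ax - b'x subject to lo <= x <= hi.

    step should be below 2/L, where L is the largest eigenvalue of A."""
    n = len(b)
    x = list(x0)
    for _ in range(iters):
        grad = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(n)]
        # gradient step followed by projection onto the box [lo, hi]
        x = [min(max(x[i] - step * grad[i], lo[i]), hi[i]) for i in range(n)]
    return x
```

On a tiny diagonal example with unconstrained minimizer (2, −2) and box [0, 1]², the iterates converge to the constrained solution (1, 0).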

[PDF]
[Abstract]

6 

Design of a Boost DC-DC Converter for Energy Harvesting Applications in 40nm CMOS Process
DC-DC converters are critical building blocks in the energy harvesting systems used to power implantable biomedical devices. They are required to meet very strict specifications and to consume as little power as possible. Therefore, power conversion efficiency and stable operation in a varying environment are the major considerations in this thesis project, whose target is to design a DC-DC converter for energy harvesting applications.
Conventional PWM control is usually not suitable for DC-DC converters in energy harvesting applications because of its poor stability and low power conversion efficiency over wide input voltage and load current ranges. It is demonstrated that the adaptive on-time/off-time (AOOT) control proposed in this thesis is an excellent alternative, and that the zero current switching (ZCS) adjustment technique can be applied to further improve the performance of the DC-DC converter by fine-tuning the off-time.
In this thesis, a systematic design flow for a boost DC-DC converter is presented, from the design of the power plant, to the selection of the most suitable control technique, to the transistor-level implementation, and finally to the layout design. Moreover, the circuitry of a boost DC-DC converter and the layout of most of its parts have been implemented in the TSMC 40nm CMOS process. The post-layout simulation results show that the proposed boost DC-DC converter can generate a stable 1V output voltage with very small ripples (<~10mV) and achieve more than 90% (maximum about 95%) power conversion efficiency over a wide input voltage range (0.35V~0.65V) and a wide load current range (1mA~10mA).
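For reference, the ideal steady-state conversion ratio of a boost converter in continuous conduction mode is V_out = V_in/(1 − D). A quick sketch (illustrative textbook relation, not the thesis design) of the duty cycle required across the stated input range:

```python
def boost_duty_cycle(v_in, v_out):
    """Ideal steady-state duty cycle of a boost converter in CCM: v_out = v_in/(1 - D)."""
    if not 0 < v_in < v_out:
        raise ValueError("boost conversion requires 0 < v_in < v_out")
    return 1.0 - v_in / v_out
```

For the 0.35V~0.65V input range and a 1V output, the ideal duty cycle spans roughly 0.35 to 0.65, which is the range the on-time/off-time control has to cover.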

[PDF]
[Abstract]

7 

Optimal distributed point control of linear elliptic equations
In this master thesis, we consider optimal function and point control problems governed by linear elliptic partial differential equations with bilateral control constraints. The aim of this work is to choose a control function as a linear combination of Dirac delta functions and to solve the optimal distributed point control problem. The optimality systems of the point and function control problems are derived by the Lagrangian principle and the reduced functional, respectively. The optimality system is discretized by the finite element method (FEM) and the finite volume method (FVM). We apply a semismooth Newton (SSN) method, which is equivalent to a primal-dual active set strategy (PDASS), to solve the discretized optimality system. As a second solution method, we propose the projected gradient (PG) method for the same problem. For each method, we compare the results of FEM and FVM and state a preference. To identify the best solution method, we compare the results of the PDASS and PG methods to those of the MATLAB function QUADPROG. Two partial differential equations, namely the state and the adjoint equations, have to be solved in every iteration. Therefore, we develop a Multigrid Preconditioned Conjugate Gradient (MGPCG) method to solve the discretized optimality system as fast as possible. Finally, we consider linear elliptic optimal control problems with an L1 norm in the cost functional, which results in a sparse control problem. Due to the L1 norm, the objective functional becomes non-differentiable and the optimal controls are identically zero on large parts of the control domain. Using an appropriate smoothing of the non-differentiable terms in the cost functional, we solve the optimal sparse control problems theoretically and numerically.

[PDF]
[Abstract]

8 

DICOMmunicator: An Online Medical Image Transfer Service
Medical doctors and researchers have to send medical image files to each other. This may be for a consult, because a patient is treated by a different doctor, or because the files are to be used for research. The files contain personal health information, and privacy rules have to be satisfied before the files may be sent. To send the files securely, the data should be encrypted and anonymized.
This problem led to the development of DICOMmunicator, a single-page web application. It allows clinicians to use the self-descriptive application to send DICOM files to another doctor. DICOMmunicator removes patient information from the uploaded files, encrypts them, and stores them in a database. The recipient of the files is sent an email containing the decryption key, allowing only that person to download the files.

[PDF]
[Abstract]

9 

Nano Optomechanical Readout for Microcantilever Sensors
This thesis demonstrates a novel displacement sensor combining a photonic structure, a micro-ring resonator, with a micro-cantilever. The device uses the working principle of the micro-ring resonator to transfer a mechanical signal into an optical signal. The goal of this project is to design, fabricate, and measure such a device. The fabrication process itself is not very complex, but processing at chip level introduces additional complexity, as not all machines can handle small pieces; several problems needed to be settled, and most of the time was spent on fabrication, which is discussed in detail. The new design should have the same sensitivity as the micro-ring resonator.

[PDF]
[Abstract]

10 

Radar Doppler Polarimetry of Wind Turbines using the S-band PARSAX radar
Due to the need for more renewable, wind-generated energy, the number of wind turbines in the Netherlands has grown in recent years. Unfortunately, the large size and the motion of the blades negatively impact ground-based Doppler radars in the form of Doppler clutter. Such clutter degrades the performance of surveillance radars in detecting aircraft and precipitation and in estimating their parameters. Our goal is to characterize and possibly eliminate the Doppler clutter from the wind turbine, with polarization used as an important tool to reach this goal. As Doppler frequency and polarization are key characteristics for this study, a theoretical model based on them is built to predict the behavior of the rotating rotor blades of the wind turbine; it shows promising results for amplitude and phase analysis. To evaluate this model, real measurements have been carried out with the PARSAX radar, which makes it possible to represent the wind turbine data in the Doppler-range or Doppler-time domain in terms of the amplitudes and phases of all four elements of the polarization scattering matrix. Using orthogonal LFM waveforms for simultaneous polarimetric measurements, despite its considerable benefits, has one serious drawback: different residual phases in the different polarimetric channels. Testing and analysis of a few algorithms for their estimation and compensation, based on the zero-Doppler-frequency range profile of phases with different phase unwrapping algorithms, unfortunately did not result in reliable compensation of the residual phases. As a result, the analysis of wind turbine clutter has been focused on amplitude polarization characteristics only. For the experimental study a wind turbine was selected that is located in Zoeterwoude near a highway. This location allows scattering from the wind turbine and from automotive targets to be observed simultaneously at the same range, with the possibility of comparing the data.
The goal of the data analysis is to find polarimetric features that behave differently for the two types of targets, so that the clutter from the wind turbine alone can be eliminated. For comparative analysis, the absolute values of the elements of the averaged covariance matrix are used in the form of 2D histograms to find differences between the wind turbine and automotive targets. This is done for consecutive time frames, where the results show different or similar behavior depending on the time frame. Another approach is to obtain and compare the so-called target Euler parameters, which are related to the physical properties of the specific target. To extend our study, a few polarimetric decomposition techniques (Pauli and H/α) are used to study the feasibility of target classification. Comparative analysis with the covariance matrix shows great potential for correlation coefficient algorithms in combination with polarization. The results are promising, but vary, as the correlation coefficient depends strongly on the vehicle and on the orientation angle of the blades. The results of direct estimation of the Euler parameters and of the H/α decomposition both show differences between vehicles and blades, and therefore potential for distinguishing the two targets. However, the results are affected by the residual phase, so additional research on this problem is recommended for better reliability.
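The Pauli decomposition mentioned above maps the (reciprocal) scattering matrix onto odd-bounce, even-bounce, and 45°-oriented components; a minimal sketch (function name illustrative):

```python
def pauli_decomposition(s_hh, s_hv, s_vv):
    """Pauli basis components of a reciprocal scattering matrix (s_hv == s_vh).

    Returns the (odd-bounce, even-bounce, 45-degree/volume) complex amplitudes."""
    a = (s_hh + s_vv) / 2 ** 0.5   # surface / odd-bounce scattering
    b = (s_hh - s_vv) / 2 ** 0.5   # dihedral / even-bounce scattering
    c = 2 ** 0.5 * s_hv            # 45-degree-oriented / volume scattering
    return a, b, c
```

A trihedral-like target (S_hh = S_vv, no cross-polar term) lands entirely in the odd-bounce channel, while a dihedral-like target (S_hh = −S_vv) lands in the even-bounce channel, which is what makes the decomposition useful for separating target types.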

[PDF]
[Abstract]

11 

Vision-based velocity control on a Philips Experimental Robot Arm
The challenge in this thesis is to find out whether an off-the-shelf embedded system can replace an off-the-shelf laptop or desktop computer when its task is to perform vision-based velocity control on a robotic arm using inverse kinematics.
In this thesis an algorithm was developed that was to be tested in simulation and to run (semi-)autonomously on an embedded system, but no good test results on the algorithm could be obtained.
Developing and testing an algorithm using an existing simulation proved to be very problematic, as the simulation software used is very complex and is no longer supported by its developers.
Although the embedded system was chosen because it is equipped with a digital signal processor, it turned out that its proprietary driver is mutually exclusive with the robot-messaging middleware with respect to operating-system kernel support: the choice was between the driver, using an old kernel, or the middleware, using a new kernel. The latter was chosen.
A real-time software kernel patch necessary to communicate with the robotic arm was unfortunately still in development in the final stage of this work.
Porting an inverse kinematics algorithm from MATLAB to C++ and adapting the trajectory-generating algorithm for the middleware went well, but these could not be tested thoroughly because of the simulation and real-time issues. The same holds for testing the velocity control algorithm.
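The kind of inverse kinematics routine that was ported can be illustrated with the textbook closed-form solution for a planar two-link arm. This is a sketch under that simplifying assumption; the actual robot arm has more degrees of freedom:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm (one elbow branch).

    Returns joint angles (t1, t2) reaching end-effector position (x, y)."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)   # law of cosines for the elbow
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return t1, t2
```

Velocity control would then command joint velocities toward the joint angles returned by such a routine; checking the result against the forward kinematics is the usual sanity test.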
The conclusion of this report is that future work is necessary to see whether the developed algorithm for vision-based velocity control actually works.

[PDF]
[Abstract]

12 

Robust Solutions for the Resource-Constrained Project Scheduling Problem: Understanding and Improving Robustness in Partial Order Schedules produced by the Chaining algorithm
Robustness is essential for schedules that are executed under uncertain conditions. In this thesis we study robustness in Partial Order Schedules, which represent sets of solutions for instances of the Resource-Constrained Project Scheduling Problem. They can be generated using a greedy procedure called chaining, which can easily be adapted to use various heuristics. We use an empirical method to gain an understanding of robustness and of how the chaining procedure contributes to it.
Based on the findings of an exploratory study we develop three models, each capturing aspects of robustness on a different level. The first model describes how a single activity is affected by various disturbances. The second model predicts how structural properties of Partial Order Schedules can reduce the effect of these disturbances. The third model describes how heuristics for the chaining procedure can influence these properties.
Using experimental evaluation, we found that the combined model is not complete. The experimental results conformed to the expectations set by the third model, but not to those of the second model. We therefore suspect that the second model is too simplistic for accurate predictions, but since it does match earlier observations we believe it is a good starting point for further understanding.
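The chaining procedure referred to above can be sketched as follows: given a resource-feasible fixed-time schedule, activities are greedily appended to one chain per resource unit, which turns resource constraints into precedence constraints. The heuristic shown (append to the earliest-free chain) is just one illustrative choice, and all names are hypothetical:

```python
def chain_schedule(activities, capacity):
    """activities: (name, start, end) tuples of a resource-feasible schedule.

    Returns one chain (list of activity names) per resource unit."""
    chains = [[] for _ in range(capacity)]
    free_at = [0] * capacity   # time at which each chain's last activity ends
    for name, start, end in sorted(activities, key=lambda a: a[1]):
        # earliest-free heuristic: append to the chain that frees up first
        i = min(range(capacity), key=lambda j: free_at[j])
        if free_at[i] > start:
            raise ValueError("schedule is not resource-feasible")
        chains[i].append(name)
        free_at[i] = end
    return chains
```

Swapping the chain-selection rule for another heuristic is exactly the kind of adaptation the thesis studies for its effect on robustness.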

[PDF]
[Abstract]

13 

An Online Patient Scoring System
At this moment, when a clinician needs to conduct a study, he sends all his patients a questionnaire form at various moments. After retrieving these forms, the clinician manually processes all the data by hand. This is a time-consuming job that could be automated. During this project we created an application that automates this process. Using the developed application, the only things a clinician has to do are create forms, add patients, and tell the system when a patient has to receive a form.

[PDF]
[Abstract]

14 

Suggesting Queries using Query-Flow Graphs to find Dutch Content with Curated Tags
One of the standard features of today’s major Web search engines is query suggestion, which aids users in formulating their search queries. Over the years, a number of different approaches have been proposed, which have commonly been evaluated in the standard Web search setting. In this thesis, we build a query suggestion pipeline based on query log data collected from a more constrained environment which, though also large-scale, differs considerably from standard Web search with respect to its users, indexing process, and Web coverage. We implement a number of suggestion approaches based on query-flow and term-query graph models and investigate the extent to which we can replicate the results in the literature in this more constrained environment. In the process, we investigate the implementability and replicability of published Web-based query-log approaches and experiments. We find that it is possible to apply the query suggestion techniques to a constrained environment, but that a trade-off between suggestion usefulness and query coverage is introduced when considering suggestion effectiveness.
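A query-flow graph in its simplest form records how often one query follows another within a session; suggestions are then the most frequent successors. A minimal sketch (the session data and function names are illustrative):

```python
from collections import defaultdict, Counter

def build_query_flow_graph(sessions):
    """sessions: lists of consecutive queries; edge weights count reformulations."""
    graph = defaultdict(Counter)
    for session in sessions:
        for q, q_next in zip(session, session[1:]):
            graph[q][q_next] += 1
    return graph

def suggest(graph, query, k=3):
    """Top-k suggestions for a query: its most frequent follow-up queries."""
    return [q for q, _ in graph[query].most_common(k)]
```

The usefulness/coverage trade-off mentioned above shows up directly in such a model: pruning low-weight edges improves suggestion quality but leaves more queries without any suggestion.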

[PDF]
[Abstract]

15 

Ask the Right Expert: Question Routing based on User Expertise in Web Question Answering Systems
Question Routing systems aim to route questions to the users best suited to answer them. Different techniques are used to match candidate users to questions by considering properties of both. Existing techniques, however, do not consider the expertise of the candidate.
This work proposes an approach to Question Routing in which user expertise is taken into account. The proposed approach is a three-stage process that allows for different configurations of existing matching techniques and user expertise. An experiment is set up to compare different Question Routing configurations. In total, thirteen configurations are evaluated, all based on three content-based baseline methods. Stack Overflow is used as the source of questions, answers, and users for evaluating the performance of the different configurations, with a dataset containing six months’ worth of questions. The results show that incorporating expertise into Question Routing algorithms can provide a significant performance increase.

[PDF]
[Abstract]

16 

Quantification of deformation and sliding of orbital fat during rotation of the eye
The anatomy of the eye is well known; however, the mechanical properties and behaviour of the eye are not. Knowing these properties could potentially improve preoperative planning in orbital surgery. To get a better understanding of the mechanical properties of the eye while it is in motion, a Finite Element Model (FEM) has been developed at the TU Delft by Schutte et al. [1]. For the FEM it is important to quantify which part of a movement is sliding and which part is deformation, because the two are modelled in different ways.
We present a complete pipeline to quantify sliding from MRI image volumes. The registration methods previously applied to the orbit are unable to deal with sliding. Therefore, two methods developed for registration problems in which sliding occurs are tested for applicability to the orbit using phantoms. The method developed by Berendsen et al. [2], the sliding-splines method, performs best on the phantoms and is applied to the orbit. Its results are compared to those of regular B-splines registration, which had been the registration method used for the orbit up until this project. Amelon et al. [3] have developed a method to characterize sliding from displacement fields, which had never been applied to the orbit. We applied this method to the displacement fields generated by the regular B-splines registration and by the sliding-splines method. Comparison of the two registration methods using this characterization shows that the methods previously applied to the orbit are unable to preserve sliding in their displacement fields. The sliding-splines method is able to preserve sliding, but also introduces sliding in regions where it does not occur. We conclude that further study into registration methods that allow for sliding is necessary, especially into the sliding-splines method, since it is able to preserve sliding and shows promising results.

[PDF]
[Abstract]

17 

Modeling of Collection and Transmission Losses of Offshore Wind Farms for Optimization Purposes
Despite recent technological improvements, the cost of offshore wind energy has not yet reached competitive levels. Optimization tools designed to reduce the overall losses in offshore wind farms (OWFs) may reduce their cost of energy. Hence, fast and accurate loss models are necessary for the optimization of OWF designs.
The aim of this thesis is the development of such models to calculate the power losses in the OWF cables, transformers, and converters of the collection and transmission systems (HVAC or HVDC). The models need to capture the main steady-state loss sources while being computationally light for the optimization process. All the proposed models consider the main sources of losses during steady-state operation as well as their dependence on temperature. The cable model considers the type of soil surrounding the cable, whereas the converter model takes the IGBT datasheet information into account to compute the power losses.
A 640 MW OWF situated at 50, 100, and 150 km from shore is designed and employed to showcase the joint operation of the models and to assess the impact on the power losses of different variables, such as the type of soil or the voltage level of the collection system. In addition, the proposed models are compared against standard models (in which temperature dependence is not considered) for both the HVDC and HVAC transmission schemes.
The higher precision of the proposed models means that designers may better predict the profitability of OWFs.
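The temperature dependence captured by the cable model can be illustrated with the standard linear resistance-temperature correction for conductors. This is a sketch; the parameter names are illustrative, and the real model also accounts for soil, sheath, and dielectric effects:

```python
def cable_conduction_loss(i_rms, r_ac_20, alpha, conductor_temp):
    """Three-phase conduction loss per metre with linear resistance correction.

    i_rms: phase current [A]; r_ac_20: AC resistance per metre at 20 degC [ohm/m];
    alpha: temperature coefficient of resistance (about 0.00393 1/K for copper)."""
    r_t = r_ac_20 * (1.0 + alpha * (conductor_temp - 20.0))
    return 3.0 * i_rms ** 2 * r_t   # W/m, summed over the three phases
```

Going from 20 °C to a typical 90 °C conductor temperature raises the copper resistance, and hence the loss, by about 27%, which is why ignoring temperature dependence biases standard loss models.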

[PDF]
[Abstract]

18 

One world is not enough: Development of a software system for connecting ZigBee devices to an IoT gateway
With the dawn of the era of the Internet of Things (IoT), more everyday objects are equipped with technology that allows them to be connected to the Internet, enabling remote control of, or data gathering from, these objects.
DSP Innovation B.V. aims to respond to this emerging development by introducing a second-generation IoT gateway to be used in combination with their SWYCS energy management solutions.
During this project, a software system for this new gateway has been developed which allows several third-party ZigBee-enabled devices to connect to the gateway.
The software system allows for remotely retrieving real-time device data and modifying the configuration of devices connected to the IoT gateway.
Measurement data of the devices connected to the gateway is automatically stored by the developed software solution and regularly pushed to the DSP cloud servers via a secure connection.
The historical data in the DSP cloud servers can be used by the existing SWYCS software solutions to provide customers with insight into their energy usage and to automatically assist them in achieving energy savings in their company.
To arrive at the design of the software system for the IoT gateway, research was done into the various communication protocols that could be used and into the hardware at hand.
For the current application, the ZigBee communication protocol with the ZigBee Home Automation stack was found to be the most appropriate choice, although the software system also supports connecting lighting appliances that operate using ZigBee Light Link.
At the end of the project a fully functional prototype of the software system was delivered.
As the delivered system is a prototype, it is not yet production-ready.
An automated software installation and update process should be integrated, and support for more devices should be added, when the system is prepared for production and presentation to customers.

[PDF]
[Abstract]

19 

Online learning algorithms: methods and applications
In this research we study several online learning algorithms in the online convex optimization framework. Online learning algorithms make sequential decisions without information about future events. To measure the success of a decision there is a (bounded) cost for each decision, revealed after every time step. The goal of these algorithms is to perform roughly as well as the best fixed decision in hindsight, which is measured by calculating what we call the regret.
The algorithms we studied are the Multiplicative Weights Method, the Weighted Majority Algorithm, the Online Gradient Descent Method, and the Online Newton Step Algorithm. All of these methods have their own applications and regret bounds, which are also studied; in particular, we give proofs for the Multiplicative Weights Method and the Weighted Majority Algorithm.
Furthermore, we provide an implementation of the Online Gradient Descent Method for a portfolio management problem with four stocks, in which we seek the investment strategy that yields the largest profit. This led to a wealth increase by a factor of 1.37 after four years of investing, whereas the best fixed distribution over the stocks in hindsight leads to a wealth increase by a factor of 1.68. Further analysis shows that the implementation works better than we could have expected beforehand.
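The Multiplicative Weights Method studied here admits a very short implementation; the sketch below (the exponential-update Hedge variant with losses in [0, 1]) is illustrative rather than the thesis's code:

```python
import math

def multiplicative_weights(loss_rounds, eta):
    """Run multiplicative weights over per-round expert loss vectors in [0, 1].

    Returns the algorithm's expected total loss when playing the weighted mixture."""
    n = len(loss_rounds[0])
    weights = [1.0] * n
    total_loss = 0.0
    for losses in loss_rounds:
        z = sum(weights)
        probs = [w / z for w in weights]                       # normalized mixture
        total_loss += sum(p * l for p, l in zip(probs, losses))
        # exponentially down-weight experts in proportion to their loss
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
    return total_loss
```

Against an expert that is always right, the algorithm's cumulative loss (and hence its regret) stays bounded by a constant rather than growing with the number of rounds, which is the behaviour the regret bounds quantify.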

[PDF]
[Abstract]

20 

DokterService: A Security Information and Event Management (SIEM) System
KPMG is a multinational firm focused on three pillars: tax advisory, accountancy, and advisory. The advisory section, in particular IT Advisory, required a system able to analyze, process, and visualize captured network traffic. This network traffic is gathered from the company itself or from its clients. The goal of this project is to design and develop a Security Information and Event Management (SIEM) system that manages Network Intrusion Detection services and provides a report specifying the condition of the analyzed network. In order to manage the different Intrusion Detection Systems (IDSs), their results need to be merged. Every IDS produces a different result based on its focus; to gain the most valuable information from the network traffic, one has to merge all IDS results. The major challenge in developing such a SIEM system is that it should be able to process an enormous amount of data, roughly between 2 and 10 TB. The stakeholder required the system to be built in a modular way. This benefits both the maintainability and adaptability of the system, since the customer is responsible for extending the product. During the research, the development team acquired a large amount of knowledge about network traffic. In addition, they gained understanding and expertise in the IDSs used. With this knowledge, the product was molded into a useful, extendable, and efficient DokterService for the client. The development process for the DokterService was set up by means of the Scrum methodology. Changing requirements are a prime reason to apply Scrum, as development tasks are generated flexibly; hence, during the development process, any (un)expected changes could easily be incorporated into the product and into the Scrum process.
The DokterService that was created is able to read, process, merge, and visualize the data gathered from the network traffic. Continuously throughout the development, the product was tested on captured malicious network traffic, which varied tremendously in size. Furthermore, the product was tested by means of unit tests and continuous integration, which assured the team that the developed components work as they should. In addition to the aforementioned ways of testing, we also had continuous interaction with our stakeholders and organized demos for possible future end-users. The future outlook of the DokterService is very prosperous: KPMG plans to apply the DokterService for future network analysis. We have also provided some future-work recommendations, which are included in our final report.

[PDF]
[Abstract]
