1 

Dynamic Connectivity Metric for Routing in VANETs
In vehicular ad hoc networks a path has to be found to send a message from one vehicle to another. This path must have a connectivity rate high enough to give the message a high probability of arrival. Other approaches base this path on current, static information about the distribution of cars. In this paper we propose a dynamic, probabilistic approach in which we estimate the connection probability based on the current traffic density, assuming Poisson processes on all roads in the network. The analytic derivation of the connection probability is compared to values derived by simulation.
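The core quantity, the probability that a chain of Poisson-distributed vehicles connects a sender to a receiver on one road, can be illustrated with a small Monte Carlo sketch (the function name, parameters and single-road setting are illustrative, not taken from the paper):

```python
import random

def connection_prob_mc(lam, radio_range, road_len, trials=20000, seed=0):
    """Monte Carlo estimate of the probability that a road segment is
    fully connected: vehicles are placed by a Poisson process with
    density `lam` (vehicles per unit length); two consecutive nodes can
    communicate when their gap is at most `radio_range`. The segment
    counts as connected when a sender at position 0 reaches a receiver
    at position `road_len` through the chain of vehicles."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # draw vehicle positions via exponential inter-vehicle gaps
        positions = []
        x = rng.expovariate(lam)
        while x <= road_len:
            positions.append(x)
            x += rng.expovariate(lam)
        # sender and receiver anchor the chain at both ends
        chain = [0.0] + positions + [road_len]
        if all(b - a <= radio_range for a, b in zip(chain, chain[1:])):
            hits += 1
    return hits / trials
```

For example, `connection_prob_mc(0.1, 30.0, 200.0)` estimates the connection probability for a 200 m segment with on average one vehicle per 10 m and a 30 m radio range; an analytic derivation such as the paper's would be compared against estimates of this kind.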


2 

Reply to comment by A. Tommasi and D. Mainprice on Visser et al. (2008), "Probability of radial anisotropy in the deep mantle" [Earth Planet Sci. Lett. 270 (2008) 241-250]


3 

An algorithm for the fusion of two sets of (sonar) data
This paper presents an algorithm that can be used to fuse two sets of contacts, e.g. as observed by two active sonar systems. The goal is to keep a good probability of detection, without doubling the number of false alarms after combining both data sets. We present both the theoretical motivation of the algorithm, as well as simulations that illustrate its performance.
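A minimal sketch of fusing two contact sets with a distance gate, assuming 2-D contact positions and a greedy nearest-neighbour association (the gating rule, names and averaging are illustrative assumptions, not the paper's algorithm):

```python
import math

def fuse_contacts(set_a, set_b, gate):
    """Greedy gated fusion of two contact lists (2-D positions).
    A contact pair closer than `gate` is merged into one averaged
    contact (so a target seen by both sonars is not counted twice);
    unmatched contacts from either set are kept as-is."""
    used = set()
    fused = []
    for ax, ay in set_a:
        best, best_d = None, gate
        for j, (bx, by) in enumerate(set_b):
            if j in used:
                continue
            d = math.hypot(ax - bx, ay - by)
            if d <= best_d:
                best, best_d = j, d
        if best is None:
            fused.append((ax, ay))          # only seen by sonar A
        else:
            used.add(best)
            bx, by = set_b[best]
            fused.append(((ax + bx) / 2, (ay + by) / 2))  # merged contact
    # contacts only seen by sonar B
    fused.extend(c for j, c in enumerate(set_b) if j not in used)
    return fused
```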


4 

Range-dependent sonar performance modelling during Battlespace Preparation 2007
Spatial and temporal variations in sound speed can have substantial effects on sound propagation and hence on sonar performance. Operational oceanographic models can provide forecasts of oceanographic variables such as temperature, salinity and sound speed up to several days ahead. These four-dimensional forecasts can be used as input for a sonar performance prediction model. This paper presents results of such coupled ocean and sonar performance modelling, carried out during the Battlespace Preparation 2007 sea trial. The new aspect of the work is that the complete modelling chain, from ocean forecast to probability-of-detection plots, was carried out in near real-time at sea, aboard HNLMS Snellius (Figure 1).


5 

Structural similarity determines search time and detection probability
The recently introduced TSSIM clutter metric is currently the best predictor of human visual search performance for natural images (Chang and Zhang [1]). The TSSIM quantifies the similarity of a target to its background in terms of luminance, contrast and structure. It correlates more strongly with experimental mean search times and detection probabilities than other clutter metrics (Chang and Zhang [1,2]). Here we show that it is predominantly the structural similarity component of the TSSIM that determines human visual search performance, whereas the luminance and contrast components of the TSSIM show no relation with human performance. This result agrees with previous reports that human observers mainly rely on structural features to recognize image content. Since the structural similarity component of the TSSIM is equivalent to a matched filter, it appears that matched filtering predicts human visual performance when searching for a known target. © 2010 Elsevier B.V. All rights reserved.
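The three-way decomposition the TSSIM builds on can be sketched with the standard SSIM terms for two image patches; this is an illustrative decomposition of luminance, contrast and structure, not the TSSIM implementation itself:

```python
import numpy as np

def ssim_components(x, y, c=1e-6):
    """Luminance, contrast and structure terms of SSIM for two equally
    sized patches. The structure term is a normalized cross-correlation
    (covariance over the product of standard deviations), which is why
    it behaves like a matched filter; `c` stabilizes the ratios."""
    mx, my = x.mean(), y.mean()
    sx, sy = x.std(), y.std()
    sxy = ((x - mx) * (y - my)).mean()
    lum = (2 * mx * my + c) / (mx**2 + my**2 + c)
    con = (2 * sx * sy + c) / (sx**2 + sy**2 + c)
    struct = (sxy + c) / (sx * sy + c)
    return lum, con, struct
```

Identical patches score 1 on all three terms; a contrast-reversed patch scores close to -1 on the structure term while its contrast term stays high.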


6 

Probability of flooding: An uncertainty analysis
In the Netherlands a new safety approach concerning the flood defences will probably be implemented in the near future. Therefore, an uncertainty analysis is currently being carried out to determine the uncertainty in the probability of flooding. The uncertainty in the probability of flooding could indicate whether we are ready for the change to a new safety concept. Furthermore, the uncertainty analysis may give insight into the prioritization of future research to reduce uncertainties. To find out whether the uncertainties are reducible by means of future research, different types of uncertainty have to be distinguished. The analysis is done with both FORM and the Monte Carlo method. The input for the models is determined with the aid of expert opinions. For the most important failure mechanism, overflow and overtopping, the current results indicate that the most influential stochastic variables are those with inherent uncertainty, which is not reducible.
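For the simplest case, a linear limit state Z = R - S with independent normal resistance R and load S, the FORM result Pf = Phi(-beta) can be checked directly against crude Monte Carlo; a hedged sketch with illustrative variable names and values:

```python
import math
import random

def failure_prob_analytic(mu_r, s_r, mu_s, s_s):
    """FORM-style failure probability for the linear limit state
    Z = R - S with independent normal R (resistance) and S (load):
    Pf = Phi(-beta), with beta the reliability index. For a linear
    limit state and normal variables this is exact."""
    beta = (mu_r - mu_s) / math.sqrt(s_r**2 + s_s**2)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

def failure_prob_mc(mu_r, s_r, mu_s, s_s, trials=20000, seed=0):
    """Crude Monte Carlo estimate of the same failure probability."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(trials)
                if rng.gauss(mu_r, s_r) - rng.gauss(mu_s, s_s) < 0.0)
    return fails / trials
```

For nonlinear mechanisms such as overflow and overtopping, FORM linearizes the limit state at the design point instead, which is where the comparison with Monte Carlo becomes informative.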


7 

Speed-Up of the Monte Carlo Method by Using a Physical Model of the Dempster-Shafer Theory
By using the Monte Carlo method, we can obtain the minimum value of a function V(r) that is generally associated with the potential energy. In this paper we present a method that makes it possible to speed up the classical Monte Carlo method. The new method is based on the observation that the Boltzmann transition probability and the concept of local thermodynamical equilibrium give rise to an initial state of maximum entropy, which is subsequently modified by using the information on the internal structure of the system. The classical thermodynamic model does not take into account any structures inside the system, and therefore in many cases does not accurately model the system itself. In an attempt to take into account the internal structure of the system, we propose a physical model of the belief measure as defined in the Dempster-Shafer theory. The recent discovery by Resconi of an algorithm to calculate the probability distribution that had previously been developed by Harmanec and Klir, and which is consistent with the belief measure, opens the way to utilizing the Boltzmann distribution not only with a uniform distribution of probability, but with an arbitrary distribution of probability to guide the Monte Carlo iterative method towards the global minimum value of the potential energy. Starting from local thermodynamic equilibrium (i.e., local symmetry), the algorithm computes a new distribution over subsystems, resulting in a non-uniform distribution and in symmetry breaking. In the general case one can start with a different initial distribution induced by other local symmetries, corresponding to specific differential equations (e.g., the Fokker-Planck equation), and calculate from this the global distribution corresponding to the breaking of local symmetry. © 1998 John Wiley & Sons, Inc.
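The classical starting point the paper modifies, Metropolis Monte Carlo with the Boltzmann acceptance probability, looks as follows in a minimal one-dimensional sketch (the Dempster-Shafer-guided proposal distribution of the paper is not reproduced here; this is the plain textbook method with a uniform proposal):

```python
import math
import random

def metropolis_minimize(V, x0, T=1.0, step=0.5, iters=5000, seed=0):
    """Classical Metropolis Monte Carlo search for a minimum of a
    potential V: a move is always accepted when it lowers V, and is
    otherwise accepted with the Boltzmann probability exp(-dV/T).
    Proposals are drawn uniformly from [-step, step]; the best state
    seen so far is returned."""
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        dV = V(cand) - V(x)
        if dV <= 0 or rng.random() < math.exp(-dV / T):
            x = cand
            if V(x) < V(best):
                best = x
    return best
```

Replacing the uniform proposal with a structured, non-uniform one is exactly where a belief-measure-derived distribution would enter.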


8 

Probabilistic Classification between Foreground Objects and Background
Tracking of deformable objects like humans is a basic operation in many surveillance applications. Objects are detected as they enter the field of view of the camera and are then tracked for as long as they are visible. A problem with tracking deformable objects is that the shape of the object has to be re-estimated for each frame. We propose a probabilistic framework combining object detection, tracking and shape deformation. We make use of the probabilities that a pixel belongs to the background, a new object or any of the known objects. Instead of using arbitrary thresholds to decide to which class a pixel should be assigned, we assign the pixel based on the Bayes criterion. Preliminary experiments show that the classification error drops to about half the error of traditional approaches.
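Assigning a pixel by the Bayes criterion rather than by a fixed threshold can be sketched as follows, assuming Gaussian class-conditional pixel models with known priors (an illustrative stand-in for the paper's background/new-object/known-object models):

```python
import math

def classify_pixel(value, classes):
    """Assign a pixel to the class with maximal posterior probability
    (Bayes criterion). `classes` maps a label to (prior, mean, std) of
    a Gaussian pixel-value model; the posterior is proportional to
    prior * likelihood, so no normalization is needed for the argmax."""
    def posterior(prior, mu, sigma):
        lik = (math.exp(-0.5 * ((value - mu) / sigma) ** 2)
               / (sigma * math.sqrt(2 * math.pi)))
        return prior * lik
    return max(classes, key=lambda k: posterior(*classes[k]))
```

Because the priors enter the comparison directly, a rarely occurring class needs stronger evidence before a pixel is assigned to it, which is what a fixed threshold cannot express.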


9 

Alternative measurement techniques for infrared sensor performance
This paper deals with measurement and characterization of IR sensor performance. Alternative methods have been investigated in order to determine the performance more reproducibly and accurately. Furthermore, the new methodologies reduce the subjectivity of performance parameters such as the standard minimum resolvable temperature difference (MRTD) and tend to reduce the time (and thus cost) of measurement. The three methods described concern the use of (1) an IR zoom collimator, (2) a test device with line targets, and (3) an extended target combined with a TV camera instead of a human observer and a double-line source, resolving the ambiguity beyond the Nyquist frequency in staring focal plane array cameras. In the IR zoom collimator a variety of targets are used in four different orientations; the observer has to report not only the recognition, but also the orientation of the target. Instead of varying the temperature difference of the target, its range is varied by zooming in and out. In the rapid objective MRTD tester, the performance of the eye is simulated with a simple formula. The complete MRTD is obtained in about two minutes. In the double-slit method the best phase of the detector matrix is taken in comparison with the two line sources; instead of the MRTD curve, a new type of performance curve is measured: the minimum resolvable intensity difference (MRID), from which the range performance is obtained similarly to the classical MRTD method. The paper contains examples of measurements with each of the methods. ©2003 Society of Photo-Optical Instrumentation Engineers.


10 

Length effects in reliability analysis of flood protection systems
The defence against flooding by storm surges and large river discharges is provided by complex systems of dikes, dunes, retaining walls, higher grounds, barriers, locks and so on. Spatial correlations inside and between the various components play an important role in the reliability and risk assessment of the system. The correlation may be present in the loading terms as well as on the resistance side. In the Netherlands computer programs have been developed to deal with these correlation effects. The paper describes both the background modelling and the calculation procedures of these computer codes and discusses a number of calculation results.


11 

Do cortical neurons process luminance or contrast to encode surface properties?


12 

A modeling approach to assess the effectiveness of BLEVE prevention measures
Distribution and storage of liquefied pressurised gases is a critical safety issue, often resulting in a very high individual and societal risk, at least in densely populated zones. Several risk assessments have pointed out that this is mostly due to the possible occurrence of a Boiling Liquid Expanding Vapour Explosion (BLEVE) of a tank that may be exposed to an intense fire as a consequence of release scenarios. In the present work, a modelling approach is presented for the calculation of fired BLEVE probability. Simplified models were obtained for the estimation of the vessel time to failure with respect to the radiation intensity on the vessel shell. The time to failure was compared to a reference time required for effective mitigation actions and thus related to the escalation probability. The analysis of the effect of protective measures, such as thermal insulating coatings and pressure relief devices, on the time to failure allowed the identification of the necessary requirements for effective BLEVE prevention.


13 

Delay analysis and optimization of bandwidth request under unicast polling in IEEE 802.16e over Gilbert-Elliott error channel
In the centralized polling mode in IEEE 802.16e, a base station (BS) polls mobile stations (MSs) for bandwidth reservation in one of three polling modes: unicast, multicast, or broadcast polling. In unicast polling, the BS polls each individual MS to allow it to transmit a bandwidth request packet. This paper presents an analytical model for the unicast polling of bandwidth requests in IEEE 802.16e networks over a Gilbert-Elliott error channel. We derive the probability distribution for the delay of bandwidth requests due to wireless transmission errors and find the loss probability of request packets due to finite retransmission attempts. By using the delay distribution and the loss probability, we optimize the number of polling slots within a frame and the maximum retransmission number while satisfying QoS on the total loss probability, which combines two losses: packet loss due to exceeding the maximum number of retransmissions, and delay outage loss due to the maximum tolerable delay bound. In addition, we obtain the utilization of polling slots, which is defined as the ratio of the number of polling slots used for the MS's successful transmissions to the total number of polling slots used by the MS over a long run time. Analysis results are shown to match well with simulation results. Numerical results give examples of the optimal number of polling slots within a frame and the optimal maximum retransmission number depending on delay bounds, the number of MSs and the channel conditions. Copyright © 2009 The Institute of Electronics, Information and Communication Engineers.
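The loss mechanism, all retransmission attempts of a request failing over a two-state Gilbert-Elliott channel, can be simulated directly. This sketch is illustrative and does not reproduce the paper's analytical model; parameter names are assumptions:

```python
import random

def request_loss_prob(p_gb, p_bg, e_good, e_bad, max_retx,
                      trials=20000, seed=0):
    """Simulate bandwidth-request transmission over a two-state
    Gilbert-Elliott channel: the channel alternates between a good and
    a bad state (per-slot transition probabilities p_gb, p_bg) with
    per-state packet error rates e_good, e_bad. A request is lost when
    all 1 + max_retx attempts fail; the fraction of lost requests is
    returned."""
    rng = random.Random(seed)
    losses = 0
    for _ in range(trials):
        good = True
        delivered = False
        for _ in range(1 + max_retx):
            err = e_good if good else e_bad
            if rng.random() >= err:
                delivered = True
                break
            # attempt failed: advance the channel state one slot
            if good and rng.random() < p_gb:
                good = False
            elif not good and rng.random() < p_bg:
                good = True
        if not delivered:
            losses += 1
    return losses / trials
```

Sweeping `max_retx` in such a simulation is the empirical counterpart of the paper's optimization of the maximum retransmission number against the total loss probability.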


14 

Geostatistical modelling of the association between malaria and child growth in Africa
Background: Undernutrition among children under 5 years of age continues to be a public health challenge in many low- and middle-income countries and can lead to growth stunting. Infectious diseases may also affect child growth; however, their actual impact on the latter can be difficult to quantify. In this paper, we analyse data from 20 Demographic and Health Surveys (DHS) conducted in 13 African countries to investigate the relationship between malaria and stunting. Our objective is to make inference on the association between malaria incidence during the first year of life and height-for-age Z-scores (HAZs). Methods: We develop a geostatistical model for HAZs as a function of both measured and unmeasured child-specific and spatial risk factors. We visualize stunting risk in each of the 20 analysed surveys by mapping the predictive probability that HAZ is below -2. Finally, we carry out a meta-analysis by modelling the estimated effects of malaria incidence on HAZ from each DHS as a linear regression on national development indicators from the World Bank. Results: A non-spatial univariate linear regression of HAZ on malaria incidence showed a negative association in 18 out of 20 surveys. However, after adjusting for spatial risk factors and controlling for confounding effects, we found a weaker association between HAZ and malaria, with a mix of positive and negative estimates, of which 3 out of 20 are significantly different from zero at the conventional 5% level. The meta-analysis showed that this variation in the estimated effect of malaria incidence on HAZ is significantly associated with the amount of arable land. Conclusion: Confounding effects on the association between malaria and stunting vary both by country and over time. Geostatistical analysis provides a useful framework that allows us to account for unmeasured spatial confounders. Establishing whether the association between malaria and stunting is causal would require longitudinal follow-up data on individual children.


15 

Security Games with Probabilistic Constraints on the Agent’s Strategy
This paper considers a special case of security games dealing with the protection of a large area divided into multiple cells for a given planning period. An intruder decides which cell to attack, and an agent selects a patrol route visiting multiple cells from a finite set of patrol routes such that some given operational conditions on the agent's mobility are met. For example, the agent might be required to patrol some cells more often than others. In order to determine strategies for the agent that deal with these conditions and remain unpredictable for the intruder, this problem is modeled as a two-player zero-sum game with probabilistic constraints such that the operational conditions are met with high probability. We also introduce a variant of the basic constrained security game in which the payoff matrices change over time, to allow for payoffs that may change during the planning period.
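The unconstrained core of such a game, finding an unpredictable mixed strategy for the agent in a zero-sum matrix game, can be approximated by fictitious play. This sketch omits the paper's probabilistic operational constraints and uses an illustrative payoff convention:

```python
def fictitious_play(A, iters=5000):
    """Fictitious play for a zero-sum matrix game: A[i][j] is the
    agent's payoff when it plays patrol route i and the intruder
    attacks cell j. Each round, both players best-respond to the
    opponent's empirical mixture of past plays; the agent's empirical
    frequencies approximate an optimal (unpredictable) mixed strategy."""
    m, n = len(A), len(A[0])
    row_counts = [0] * m   # how often the agent played each route
    col_counts = [0] * n   # how often the intruder attacked each cell
    row_counts[0] = 1
    col_counts[0] = 1
    for _ in range(iters):
        # intruder minimizes the agent's expected payoff
        j_best = min(range(n),
                     key=lambda j: sum(row_counts[i] * A[i][j]
                                       for i in range(m)))
        # agent maximizes its expected payoff
        i_best = max(range(m),
                     key=lambda i: sum(col_counts[j] * A[i][j]
                                       for j in range(n)))
        col_counts[j_best] += 1
        row_counts[i_best] += 1
    total = sum(row_counts)
    return [c / total for c in row_counts]
```

The chance constraints of the paper (e.g. "patrol cell k often enough with high probability") would be added on top of this basic game, typically via a constrained optimization formulation rather than plain fictitious play.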


16 

Design and analysis of compressed sensing radar detectors
We consider the problem of target detection from a set of Compressed Sensing (CS) radar measurements corrupted by additive white Gaussian noise. We propose two novel architectures and compare their performance by means of Receiver Operating Characteristic (ROC) curves. Using asymptotic arguments and the Complex Approximate Message Passing (CAMP) algorithm, we characterize the statistics of the l1-norm reconstruction error and derive closed-form expressions for both the detection and false alarm probabilities of the two schemes. Of the two architectures, we demonstrate that the better performing one consists of a reconstruction stage based on CAMP followed by a detector. This architecture, which outperforms the l1-based detector in the ideal case of known background noise, can also be made fully adaptive by combining it with a conventional Constant False Alarm Rate (CFAR) processor. Using the state evolution framework of CAMP, we also derive Signal-to-Noise Ratio (SNR) maps that, together with the ROC curves, can be used to design a CS-based CFAR radar detector. Our theoretical findings are confirmed by means of both Monte Carlo simulations and experimental results. © 2012 IEEE.
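A conventional cell-averaging CFAR stage of the kind such an adaptive architecture is combined with can be sketched generically; window sizes and the exponential-noise threshold rule below are textbook choices, not parameters taken from the paper:

```python
import numpy as np

def ca_cfar(x, guard=2, train=8, pfa=1e-3):
    """Cell-averaging CFAR on a 1-D power sequence x: for each cell,
    the noise level is estimated from `train` cells on each side
    (skipping `guard` cells next to the cell under test), and the
    threshold scale alpha is set from the desired false-alarm
    probability assuming exponentially distributed noise power.
    Returns the indices declared as detections."""
    n = 2 * train
    # classic CA-CFAR scale factor for N training cells
    alpha = n * (pfa ** (-1.0 / n) - 1.0)
    hits = []
    for i in range(guard + train, len(x) - guard - train):
        left = x[i - guard - train : i - guard]
        right = x[i + guard + 1 : i + guard + 1 + train]
        noise = (left.sum() + right.sum()) / n
        if x[i] > alpha * noise:
            hits.append(i)
    return hits
```

Because the threshold scales with the locally estimated noise, the false alarm rate stays (approximately) constant when the background level changes, which is the property the paper exploits to make the CAMP-based detector adaptive.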


17 

Radio access selection in multiradio access systems
Future wireless access systems will be characterized by their heterogeneity from a technological point of view. It is envisaged that in certain areas end-users will have a choice between various radio accesses (RAs), such as classical cellular networks (GSM, UMTS, WiMAX, etc.), WLAN hotspots, or multiple hops via wireless ad hoc networks. In this scenario it is crucial to select the most appropriate RA for a user at a given location requesting a particular service. This crucial task is performed by the radio access selection (RAS) algorithm. In this paper we evaluate four different RAS algorithms by simulations of a multi-radio access (MRA) system that consists of a cellular UMTS network and a centrally placed WLAN hotspot. The evaluation compares the performance of the MRA system with respect to the blocking probability, resource utilization, average user throughput and the 90th user throughput percentile. The simulation results show that combining the resource pools and adding a radio-resource-consumption or throughput-based criterion to the access selection results in an almost two times higher supportable traffic load for a blocking probability target of 2%. The average data throughput is increased by between 25% and 100%, depending on the traffic load, while the 90th-percentile data throughput is increased up to six times. ©2007 IEEE.


18 

The search for an alerted moving target; 2005BU2OA
We investigate a two-sided, multistage search problem in which a continuous search effort is made by one or more search units to detect a moving target in a continuous target space, under noisy detection conditions. A specific example of this problem is the hunt for an enemy submarine by naval forces. So far, this problem has not been solved, because of the difficulty of predicting the target's behaviour. A heuristic has been developed for finding promising routes for the search units. To obtain these routes, at every decision moment in time an optimal point to go to must be determined. This amounts to finding, at every decision moment, an optimum of a function that changes over time. © 2005 Operational Research Society Ltd. All rights reserved.


19 

Assessment factors for human health risk assessment: A discussion paper
The general goal of this discussion paper is to contribute toward the further harmonization of human health risk assessment. It first discusses the development of a formal, harmonized set of assessment factors. The status quo with regard to assessment factors is reviewed, that is, the type of factors to be identified, the range of values assigned, as well as the presence or absence of a scientific basis for these values. Options are presented for a set of default values and probabilistic distributions for assessment factors based on the state of the art. Methods of combining default values or probabilistic distributions of assessment factors are also described. Second, the effect parameter, the no-observed-adverse-effect level (NOAEL), is discussed. The NOAEL as selected from the toxicological database may be a poor substitute for the unknown, true no-adverse-effect level (NAEL). New developments are presented with respect to the estimation of the NAEL. The already widely discussed Benchmark Dose concept can be extended to obtain an uncertainty distribution of the Critical Effect Dose (CED). This CED distribution can be combined with estimated uncertainty distributions for assessment factors. In this way the full distribution of the Human Limit Value is derived, rather than only a point estimate, while information on dose-response relations is taken into account. Finally, a strategy is proposed for implementation of the new developments into human health risk assessments.
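Combining probabilistic distributions of assessment factors can be sketched by Monte Carlo, assuming independent lognormal factor distributions (the geometric means and spreads used here are illustrative assumptions, not the paper's proposed defaults):

```python
import math
import random

def combined_factor_percentile(factors, q=0.95, trials=20000, seed=0):
    """Combine independent lognormal uncertainty distributions of
    assessment factors by Monte Carlo: sample each factor, multiply
    the samples, and report the q-th percentile of the product.
    `factors` is a list of (geometric_mean, geometric_sd) pairs; a
    geometric_sd of 1.0 means the factor is a fixed default value."""
    rng = random.Random(seed)
    products = []
    for _ in range(trials):
        p = 1.0
        for gm, gsd in factors:
            # lognormal sample: gm * exp(z * ln(gsd)), z ~ N(0, 1)
            p *= gm * math.exp(rng.gauss(0.0, 1.0) * math.log(gsd))
        products.append(p)
    products.sort()
    return products[int(q * trials)]
```

The same sampling scheme extends directly to the paper's idea of multiplying a CED uncertainty distribution into the product instead of a point-estimate NOAEL.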


20 

The influence of adverse weather conditions on the probability of congestion on Dutch motorways
Weather conditions are widely acknowledged to contribute to the occurrence of congestion in motorway traffic by influencing both traffic supply and traffic demand. To the best of our knowledge, this is the first paper that explicitly integrates supply and demand effects in predicting the influence of adverse weather conditions on the probability of occurrence of congestion. Traffic demand is examined by conducting a stated adaptation experiment, in which changes in travel choices are observed under adverse weather scenarios. Based on these choices, a Panel Mixed Logit model is estimated. Supply effects are taken into account by examining the influence of precipitation on motorway capacity. Based on the Product Limit Method, capacity distribution functions are estimated for dry weather, light rain and heavy rain. With the developed model to integrate the supply and demand effects, breakdown probabilities can be calculated for any given traffic demand and capacity. The results show that rainfall leads to a significant increase in the probability of traffic breakdown at bottleneck locations. Interestingly, the probability of a breakdown at these bottleneck locations is predicted to be slightly higher in light rain (98.7%) than in heavy rain (95.7%) conditions, which is the result of the higher traffic demand in light rain conditions. Based on the results presented in this paper, it is recommended always to incorporate both supply and demand effects in predictions of motorway breakdown probabilities due to adverse weather conditions, to improve the validity of the predictions. © 2015 Editorial Board EJTIR. All rights reserved.
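The Product Limit Method used for the capacity distributions is the Kaplan-Meier estimator applied to breakdown (uncensored) and non-breakdown (censored) flow observations; a generic sketch with made-up data values:

```python
def product_limit_capacity(breakdown_flows, censored_flows):
    """Product Limit (Kaplan-Meier) estimate of the capacity
    distribution: `breakdown_flows` are flows at which traffic broke
    down (uncensored observations of capacity), `censored_flows` are
    observed flows that did not cause breakdown (capacity is at least
    that high). Returns (flow, P(capacity <= flow)) pairs at each
    distinct breakdown flow."""
    events = sorted(set(breakdown_flows))
    all_obs = list(breakdown_flows) + list(censored_flows)
    surv = 1.0
    out = []
    for q in events:
        n_at_risk = sum(1 for v in all_obs if v >= q)  # still "surviving"
        d = breakdown_flows.count(q)                   # breakdowns at q
        surv *= 1.0 - d / n_at_risk
        out.append((q, 1.0 - surv))
    return out
```

Estimating one such curve per weather class (dry, light rain, heavy rain) and comparing them is the supply-side step described above.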

