Bridging the Gap

Open Call Results

Open Call - Funds Awarded

Visit to FET Flagship projects workshop to network with robotics flagship proposers

Travel fund, Jeremy Wyatt (Computer Science)

This project funds travel costs relating to a visit to the FET Flagship projects workshop in Brussels. It will allow the applicant to talk to coordinators of the Robotics Flagship proposal, network with other possible collaborators, and assess the likelihood of funding and the demands of funders.

Key Performance Indicators of the Railway Industry

Fellowship, Andrew Tobias (Birmingham Centre for Railway Research & Education), with CERCIA

Immense effort is expended by the likes of the Office for Rail Regulation (ORR), Transport Statistics Great Britain (TSGB), Network Rail (NR) and the Association of Train Operating Companies (ATOC) in generating time series data for a large range of “key” indicators, but inconsistencies of basis and format (not to mention error) mean that making sense of them in the context of performance management is far from straightforward.

Each organization tends periodically to cleanse its data (i.e. to review and amend it where necessary), but in doing so the common aim is to arrive at the single “true” value of each measure for each period of interest and to throw away the others. Instead of this, how about recognizing the uncertainty associated with every single reported value and constructing confidence intervals within which we can expect the true values to lie? Decision-making, forecasting and policy development using such intervals might then all be expected to be more robust than when using what are, effectively, single point estimates.
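
As a rough illustration of the interval idea, the sketch below treats each organization's reported figure for one indicator and period as a noisy observation and derives a percentile bootstrap interval; the figures and the choice of a simple bootstrap are illustrative assumptions, not part of the proposal.

```python
import numpy as np

def bootstrap_interval(reported_values, n_boot=10_000, level=0.95, seed=0):
    """Percentile bootstrap interval for the underlying 'true' value of an
    indicator, treating each organization's reported figure as one noisy
    observation rather than discarding all but a single 'cleansed' number."""
    rng = np.random.default_rng(seed)
    values = np.asarray(reported_values, dtype=float)
    means = [rng.choice(values, size=values.size, replace=True).mean()
             for _ in range(n_boot)]
    lo, hi = np.percentile(means, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])
    return lo, hi

# Illustrative figures only: punctuality (%) for one period as reported by
# four hypothetical sources (ORR-, TSGB-, NR- and ATOC-style publications).
print(bootstrap_interval([91.2, 90.7, 91.8, 90.9]))
```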

An International Workshop on Network Inference

Workshop, Francesco Falciani (School of Biosciences), with various international computational and experimental biologists

We propose to organize an international workshop on integrative approaches to molecular network inference. The workshop is based on two components, which will be developed over the course of three days.

The first component is intended as a one-day workshop based on a series of talks from UK and international leaders in the area of computational network inference. We envisage contributions from both method developers, particularly in relation to optimization methods for model identification and parameter estimation, and computationally minded experimental biologists who use such methodologies. We expect about fifty participants for this first part of the workshop.

The second component will involve a smaller group of selected participants and will last for two additional days. The participants will have the objective of defining a general workflow for how systems, network structures, and relevant biological pathways can be reverse engineered from genomics (including transcriptomic, proteomic, and metabolomic) data using appropriate computational approaches. This workflow will also describe how those reverse-engineered models can then be validated through experimentation. During the second part of the workshop we also envisage that some of the participants will work in small groups on the development of a few proof-of-concept studies, aiming to demonstrate the applicability of some of the proposed methodologies to a state-of-the-art dataset made available by the US Army Corps of Engineers.

Mesoscopic Losses in Scale Free Networks: Optimizing the protocol for reducing network congestion

Feasibility Study, Igor Lerner (Physics and Astronomy), with Costas Constantinou (Electrical, Electronic and Computer Engineering)

We are addressing the problem of losses in complex networks. The main motivation is to demonstrate that even in the regime where average losses in complex networks are small, dynamic fluctuations of losses in scale-free networks (a good approximation for the Internet) become anomalously large, and a description in terms of mean values becomes insufficient. When the moments of the fluctuations are divergent, one needs the full probability distribution function to provide an adequate description and draw reasonable conclusions.

A network is a complex, deterministic web. Nevertheless, we know that complex systems are successfully described by random models (random matrix theory for the description of complex nuclei is but one example). In such a description one assumes the existence of an ensemble of realizations of a given complex network (each realization differs by a particular distribution of nodes and links – i.e. servers and buffers in the case of the Internet).

A specific feature of communication networks is the presence of two significantly distinct time scales. The shortest (microscopic) time is related to the discrete nature of packet inter-arrival times. The longest time is related to the response of a link (a buffer that accumulates packets and regulates traffic flow between the nodes – different servers) to a smooth change in traffic. The observation time (the time of propagation of a signal through the network) is “mesoscopic”, that is, in between the shortest and the longest scales. Since losses occur during the propagation of a signal, they happen at the mesoscopic time scale. We know (from condensed matter problems) that at the mesoscopic scale fluctuations become of the same order as the average. We have already shown that mesoscopic fluctuations in a scale-free network with power-law correlations become anomalously large. However, existing protocols tend to over-react to such fluctuational increases, closing the affected routes (even when their fraction is small), which can lead to an avalanche of losses.
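
To illustrate the ensemble-of-realizations idea in the simplest possible terms, the sketch below generates many scale-free (Barabási–Albert) graphs and records a crude congestion proxy (betweenness centrality) in each; the proxy and all parameters are stand-ins chosen for illustration and are not the applicants' loss model.

```python
import networkx as nx
import numpy as np

def load_distribution(n=500, m=2, realizations=200, seed=0):
    """Crude illustration of the ensemble idea: generate many scale-free
    (Barabasi-Albert) realizations and record the maximum node load in each,
    using betweenness centrality as a stand-in for traffic through a node."""
    rng = np.random.default_rng(seed)
    peaks = []
    for r in range(realizations):
        g = nx.barabasi_albert_graph(n, m, seed=int(rng.integers(1 << 30)))
        load = nx.betweenness_centrality(g, k=50, seed=r)  # sampled for speed
        peaks.append(max(load.values()))
    peaks = np.array(peaks)
    return peaks.mean(), peaks.std(), np.percentile(peaks, 99)

mean, std, p99 = load_distribution()
print(f"mean={mean:.3f}  std={std:.3f}  99th percentile={p99:.3f}")
```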

Towards an agent-based modelling framework based on omics data for the investigation of cancer development

Fellowship, Shan He (Computer Science), with Mark Viant (Biosciences) and Chris Buckley and Douglas Ward (Medical and Dental Sciences)

Recently, it has been hypothesised that immune systems and cancer cell populations are self-organising (SO) systems and that the underlying principle of their collective behaviour is similar to SO collective behaviour in animals. The interaction between the immune system and cancer cells, which can be seen as a predator-prey interaction, is at the root of the development of cancer. Another relatively recent development in the biological sciences has been the application of omics techniques to understand individual cells' behaviour and their microenvironment, which is crucial for the understanding of SO multi-cellular systems.

Agent-based modelling (ABM) has proved to be a powerful tool for modelling SO systems. For example, the applicant Dr Shan He has successfully applied ABM to simulate the evolution of animals' SO aggregation behaviour. However, the value of ABM for studying SO multi-cellular systems has not been fully recognised by researchers in the biomedical sciences. At the same time, novel data mining algorithms are urgently needed to discover biologically meaningful knowledge from high-dimensional and heterogeneous omics experimental data in order to understand SO multi-cellular systems. To address these problems, the research proposal I am going to develop aims at an automatic computational framework for the modelling of SO multi-cellular systems based on data mining, agent-based modelling and computational intelligence techniques. The framework will be applied to model the interactions between the immune system, cancer cells and their microenvironment in order to understand the development of cancer.
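
As a flavour of what an agent-based predator-prey model of immune and tumour cells can look like, here is a deliberately minimal grid-based sketch; the rules and probabilities are invented for illustration and are not the proposed framework.

```python
import random

GRID, STEPS = 30, 100
P_DIVIDE, P_KILL, P_IMMUNE_DEATH = 0.30, 0.70, 0.05

def neighbours(x, y):
    """Moore neighbourhood on a torus."""
    return [((x + dx) % GRID, (y + dy) % GRID)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]

def step(tumour, immune):
    """One update: tumour cells divide into a free neighbouring site with
    probability P_DIVIDE; each immune cell kills one adjacent tumour cell
    with probability P_KILL, then either dies (P_IMMUNE_DEATH) or moves to
    a random neighbouring site."""
    for cell in list(tumour):
        free = [n for n in neighbours(*cell) if n not in tumour]
        if free and random.random() < P_DIVIDE:
            tumour.add(random.choice(free))
    for cell in list(immune):
        targets = [n for n in neighbours(*cell) if n in tumour]
        if targets and random.random() < P_KILL:
            tumour.discard(random.choice(targets))
        immune.discard(cell)
        if random.random() > P_IMMUNE_DEATH:
            immune.add(random.choice(neighbours(*cell)))

random.seed(1)
tumour = {(GRID // 2, GRID // 2)}
immune = {(random.randrange(GRID), random.randrange(GRID)) for _ in range(40)}
for _ in range(STEPS):
    step(tumour, immune)
print("tumour cells:", len(tumour), "immune cells:", len(immune))
```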

Developing research collaboration project with Ningbo University in China

Travel Fund, Long-yuan Li (Civil Engineering)

This grant funds a visit to the School of Engineering at Ningbo University. It will allow the applicant to discuss a research collaboration in the field of structural design optimization that includes material degradation due to various environmental actions, such as chemical action. This proposal is novel because existing structural optimization considers mechanical loadings only. We will also discuss various potential funding sources to support the collaboration project and prepare research proposals accordingly.

Motor recovery after stroke by robot therapist assistant

Feasibility Study, Alan Wing (Psychology), with Jeremy Wyatt (Computer Science)

We will use the BTG to fund a four-month full-time salary for a research fellow to conduct a pilot study and to prepare and write a research proposal with Wing and Wyatt on techniques for gathering and analysing data from patients with visual-motor dysfunction due to stroke. This work includes:

  1. A literature search about classification of motor impairments, prognosis in stroke patients, rehabilitation approaches and their outcomes as well as relevant literature about motor learning and plasticity.
  2. Developing the proposal (with the PIs).
  3. Collecting pilot data from one stroke patient who shows visual-motor impairment and one healthy control participant with intact motor function.
  4. Organizing and writing the major parts of a grant application with behavioural and imaging studies of motor learning in stroke hemiparesis.

Dielectric Resonators as Novel Receive Elements in MRI & NMR

Feasibility Study, Paul Smith (Electronic, Electrical & Computer Engineering), with Dr Leppinen and Dr Stephen (Mathematics)

This proposal is essentially to use a Dielectric Resonator (DR) as the receive element in two systems: a Nuclear Magnetic Resonance (NMR) system and a Magnetic Resonance Imaging (MRI) system. In an NMR or MRI system the Signal to Noise (S/N) ratio is given by n*√Q, where n is the magnetic filling factor and Q is the loaded Q. In a DR the Q may be an order of magnitude larger than that of a conventional receive coil, and the magnetic filling factor can also be substantially larger. Overall this could potentially give an improvement in S/N of about an order of magnitude. Some simple results were shown at the LES/EPS away day, showing a DR with a Q about 5× larger than conventional MRI receive coils.
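
Written out, the stated relation and one illustrative set of numbers (chosen only to match the "order of magnitude" claim above) look like this:

```latex
% S/N relation quoted above, with purely illustrative numbers.
\[
  \frac{S}{N} \propto n\sqrt{Q}
  \qquad\Longrightarrow\qquad
  \frac{(S/N)_{\mathrm{DR}}}{(S/N)_{\mathrm{coil}}}
  = \frac{n_{\mathrm{DR}}}{n_{\mathrm{coil}}}\,
    \sqrt{\frac{Q_{\mathrm{DR}}}{Q_{\mathrm{coil}}}}
\]
% Example: a 10x higher loaded Q contributes a factor of about 3.2 via the
% square root; a filling factor roughly 3x higher would then give an overall
% S/N gain of order 10.
```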

However, there are several issues with the feasibility of DRs, as there are a number of key technical challenges to be overcome. One example is how to adjust, or tune, the resonant frequency of the DR. Another is whether the Radio Frequency (RF) magnetic field can be made sufficiently uniform inside the DR; this latter point is very important in NMR and fairly important in MRI systems. A further key point is optimizing the design of the DR to achieve optimum performance, and data mining the information generated by the project. Dr Smith and Dr Stephen have previously supervised projects on the modelling of Dielectric Resonators, which will be helpful in the preliminary modelling required for this project. At the moment Dr Smith, Dr Leppinen and Dr Stephen have a joint PhD student modelling complex high-frequency devices.

Profiling Learner Strategies

Fellowship, Julie Christian (Psychology) with Theodoros N. Arvanitis (Electronic, Electrical & Computer Engineering)

While considerable research interest has centred on how learning facilitates performance in demanding cognitive tasks, an area largely overlooked in the neuroscience literature is that of the individual differences that exist between learners. More specifically, studies have not taken into account the underlying psychological characteristics (i.e., learning strategies: surface, strategic) that guide individuals' choices, motivations, and effectiveness of learning. Yet, within the field of social and personality psychology there is a well-established literature that could aid both in profiling the types of learning strategies people employ and in identifying these motivations.

We wish to use our own expertise to begin addressing these issues: Christian (School of Psychology) is an expert on social psychological theories and measurement. Arvanitis (School of Electronic, Electrical & Computer Engineering) possesses the track record and expertise needed to develop intelligent software agents and data mining tools, in particular for brain imaging and spectroscopy data. Together, we are in a unique position to link developments in machine learning technology with innovations in theory and measurement from applied psychology. In this proposal, we take a critical step towards filling this gap by designing the innovative tools needed to tap key concepts surrounding learning, and to leverage these into successful grant applications.

There are 3 phases of research required to test our central question, ‘What effect do individual differences – learning strategies – have on learning and on predicting the accuracy of learning over time?’. In Phase 1, we will fine-tune the survey measures we have piloted, ensuring they are ecologically valid for data collection. In Phase 2, cognitive, behavioural, and imaging data will be collected from the participants (those completing the finalized survey measure). Tasks will include: 1. tasks that measure cognitive ability: visual short-term memory, spatial and cognitive inhibition, and attention tasks; 2. training on perceptual discrimination tasks (i.e. discrimination of visual patterns in background noise). Imaging data will include functional brain imaging data collected before and after training on the perceptual discrimination task. We have not yet linked survey findings and cognitive outcomes to possible differences in patterns of activation; this would be a principal task for Phase 2 of the programme. In Phase 3, we propose to build a data mining tool that would first profile learners' characteristics on the basis of the cognitive and psycho-social variables, and then use data mining to explore the profiles and predict accuracy of classification over time. By comparing the effects of individuals using different learning strategies (surface, strategic), we will discover which method exerts a more positive impact on learning, and whether this pattern is robust longitudinally.

In particular, using machine learning approaches, we will test whether the models extracted for learners with different profiles are typical and representative of their learning patterns. We will use independent data sets to predict from behavioural data whether an observer's brain patterns relate to the ‘good’ or ‘weak’ learners' model. For example, it is possible that only ‘good’ rather than ‘weak’ learners will show changes in neural sensitivity for stimulus features critical for perceptual decisions. In a complementary analysis, we will evaluate whether we can predict behavioural improvement due to learning for any individual observer based on whether the observer's brain activations correlate with the brain patterns of the ‘good’ or ‘weak’ learners.
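
A minimal sketch of the Phase 3 idea (profile learners by clustering, then test whether the profile plus the measures predict who improves) might look like the following; the data are synthetic and the feature choices, model settings and thresholds are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: one row per participant, columns are survey and
# cognitive measures (strategy score, short-term memory, attention, ...).
rng = np.random.default_rng(0)
n = 90
features = rng.normal(size=(n, 5))
improved = (features[:, 0] + 0.5 * features[:, 2]
            + rng.normal(scale=0.7, size=n) > 0).astype(int)

# Step 1: profile learners by clustering the standardised measures.
z = StandardScaler().fit_transform(features)
profile = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(z)

# Step 2: predict who improves with training from profile plus the measures,
# scored by cross-validation as a crude test of longitudinal robustness.
X = np.column_stack([z, profile])
scores = cross_val_score(LogisticRegression(max_iter=1000), X, improved, cv=5)
print("cross-validated accuracy: %.2f" % scores.mean())
```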

Examining the effect of using prior information in Bayesian analysis—an application to environmental valuations

Feasibility Study, Hui Li (Mathematics), with Matthew Cole and David Maddison (Economics)

This research project falls within the theme of advanced data analysis. It is crucial to enhance the efficiency, i.e., validity and accuracy, of estimates in statistics and economics. While the Bayesian method has been widely used in economics and other social sciences, it is not clear whether incorporating prior information about the parameters enhances this efficiency. In this project, we aim to develop a statistical model using the Bayesian method with an appropriate implementation of prior information, apply it to environmental valuations, and then examine whether using the prior information improves the efficiency of the analysis.

In essence, the aim of the project is to bring a new application to a well-established method. Specifically, we aim to develop a statistical model using the Bayesian method to examine the effect of the prior distribution (or belief) in contrast with little or no prior information. The scientific agenda includes:

  • endeavouring to develop a multivariate statistical model using the Bayesian method via Matlab and then converting it to Stata (software widely used in economics)
  • using the model to begin to study the effect of including prior information, using simulated data (the initial values of the parameters can be obtained from a focus group); a minimal illustration of this comparison appears after this list
  • bringing together statistical and economic considerations from many sources to evaluate and improve the accuracy of the models
  • managing the database for the environmental valuation and applying data mining to select sensible and reasonable socio-economic variables. Application to the economic dataset is not as straightforward as to the simulated data, because the variables will cover socio-economic status and political influence indicators, and thus data need to be merged from several sources.
  • applying the model to environmental economics on the environmental valuation data
  • comparing the models with / without prior information and examining the effects of prior beliefs
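
The with/without-prior comparison can be illustrated with a toy conjugate model (sketched in Python rather than the Matlab/Stata tooling named above); the willingness-to-pay numbers and prior settings are invented for illustration.

```python
import numpy as np

def normal_posterior(y, sigma, prior_mean, prior_var):
    """Posterior mean/variance for a normal mean with known noise variance
    and a normal prior (standard conjugate update)."""
    n = len(y)
    post_var = 1.0 / (1.0 / prior_var + n / sigma**2)
    post_mean = post_var * (prior_mean / prior_var + np.sum(y) / sigma**2)
    return post_mean, post_var

# Simulated willingness-to-pay data (illustrative values only).
rng = np.random.default_rng(42)
true_mean, sigma = 12.0, 5.0
y = rng.normal(true_mean, sigma, size=25)

informative = normal_posterior(y, sigma, prior_mean=10.0, prior_var=4.0)  # e.g. focus-group prior
vague = normal_posterior(y, sigma, prior_mean=0.0, prior_var=1e6)         # essentially flat
print("informative prior:", informative)
print("vague prior:      ", vague)
```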

A logical approach to heterogeneous data

Travel, Achim Jung (Computer Science)

It is well known that in real-life situations one has to deal with conflicting and incomplete information, and most of the time humans know how to work their way around the problems that arise from this. Mathematically, conflicting information leads to a complete breakdown of classical logic, as from a contradiction anything can be concluded. There is a long-established tradition in philosophy that attempts to formalise everyday reasoning in such a way that conflict and missing information can be accommodated. A major piece of work in that direction was done by N. D. Belnap in 1977, in which he proposed a certain four-valued logic. Follow-on work was done by philosophers on the one hand and logic programmers on the other.

The issue described above is not only apparent in everyday reasoning but also very much present in computer science, when one is dealing with information stored in different and heterogeneous databases and is trying to pull them together for data mining and query processing purposes. Databases have a logical description (the connection between relational databases and first-order logic is well known), and so it seems a sensible and promising approach to attempt to employ Belnap's formalism in the area of analysis of heterogeneous data.
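
A minimal sketch of Belnap's four values, using the standard "evidence for / evidence against" pair representation, shows how answers from two hypothetical databases can be pooled without collapsing a conflict; this illustrates the formalism only, not the deductive systems to be developed.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Four:
    """Belnap value as a pair: evidence for (t) and evidence against (f).
    (False, False) = None, (True, False) = True,
    (False, True)  = False, (True, True) = Both (conflicting information)."""
    t: bool
    f: bool

    def __and__(self, other):   # conjunction in the truth order
        return Four(self.t and other.t, self.f or other.f)

    def __or__(self, other):    # disjunction in the truth order
        return Four(self.t or other.t, self.f and other.f)

    def __invert__(self):       # negation swaps evidence for and against
        return Four(self.f, self.t)

    def join(self, other):      # knowledge join: pool evidence from sources
        return Four(self.t or other.t, self.f or other.f)

NONE, TRUE = Four(False, False), Four(True, False)
FALSE, BOTH = Four(False, True), Four(True, True)

# Two hypothetical databases disagree on whether a record satisfies a query:
# pooling their answers yields Both, flagging the conflict instead of hiding it.
print(TRUE.join(FALSE) == BOTH)   # True
```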

With funding from this application I would like to bring a philosopher (Dr Umberto Rivieccio) to Birmingham for a period of three months. Dr Rivieccio has recently completed a PhD on four-valued logic in Barcelona and is now one of the world's foremost experts in this particular strand of philosophical logic. I would like to make use of his expertise in the more concrete setting of querying heterogeneous data, and in particular his knowledge of deductive systems.

Feasibility of using Data Mining in the Signalling of Trains through Railway Stations and Junctions (BTG away day 2 project)

Fellowship, Andrew Tobias (Birmingham Centre for Railway Research & Education), with Behzad Bordbar (Computer Science)

Among the successful applications of Data Mining has been BT's use of Process Mining (extracting information from event logs) in service-oriented telecommunication systems, with the involvement of Dr Bordbar. The study proposed here will assess the feasibility of applying similar approaches to the sorts of decision-making and rescheduling required of railway signallers as trains approach and depart from junctions and stations either early, on time or late.
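
One basic process-mining primitive that such a study could start from is the directly-follows relation over an event log; the sketch below counts it for a toy log of signalling events (event names and train identifiers are invented, and this is not BT's system).

```python
from collections import Counter

def directly_follows(event_log):
    """Count how often activity a is immediately followed by activity b
    within the same case (train) - the basic relation behind many
    process-discovery algorithms."""
    counts = Counter()
    for _, trace in event_log:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

# Illustrative event log: (train id, ordered list of signalling events).
log = [
    ("1A23", ["approach", "route_set", "platform_arrive", "depart"]),
    ("1A24", ["approach", "hold_at_signal", "route_set", "platform_arrive", "depart"]),
    ("1A25", ["approach", "route_set", "platform_arrive", "depart"]),
]
for (a, b), n in directly_follows(log).most_common():
    print(f"{a} -> {b}: {n}")
```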

Decoding the neural computations for depth perception in the human brain

Fellowship, Andrew Welchman (Psychology), with Peter Tino (Computer Science) and Ali Chowdhury (Imaging)

Recognising and interacting with objects depends on knowing the three-dimensional (3D) structure of the world around us. Despite considerable interest in this problem, the neural circuits that support 3D perception are not understood, and artificial systems struggle. Recent advances in human brain imaging have allowed us to identify brain areas involved in the processing of binocular (disparity) and monocular (perspective, motion parallax) cues to depth structure. However, the conventional approaches for the analysis of brain imaging data lack the sensitivity necessary for decoding fine-grained selectivity for depth signals. Therefore our understanding of the neural computations that support the human ability for 3D perception remains limited.

The proposed research aims to answer this question by combining high-resolution imaging and computational methods for data mining to gain a fine-grained understanding of how depth is represented. I will do this by working with physicists (A. Chowdhury, Birmingham Univ Imaging Centre) on high-resolution fMRI sequence development and with machine learning experts (P. Tino, Computer Science).

Specifically, machine learning approaches provide a principled way of isolating, from noisy and complex data, informative signals that are diagnostic of differences between stimulus conditions (e.g. near vs far surfaces). The logic of the approach is that voxels reflect a biased sample of underlying neuronal selectivities (i.e., the sample of disparity columns is not identical in each voxel). Thus, presenting different types of stimuli results in a dissociable pattern across the population of voxels. Machine learning algorithms (e.g. support vector machines) are able to exploit these voxel preferences, which might otherwise be too weak to extract using conventional (univariate) statistical analysis. This will allow me to test whether there is fine-scale functional clustering of selectivities at the level of small sub-regions within individual cortical areas. An important step in this analysis will be to determine which voxels are plotted back to the cortical surface (i.e. the voxel selection issue). I will explore different data mining methods based on the reliability of classification (i.e. cross-validation data), SVM weights, and the mutual information between the fMRI activations for each voxel and the SVM weights. I will explore methods for quantifying topologies based on wavelet decomposition and Bayesian selection of Gaussian mixture models to describe the underlying disparity selectivities. The expected outcome is topological maps of depth that will allow me to assess differences in depth representation across cortical areas.
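
The core decoding step can be sketched very compactly with synthetic data standing in for voxel patterns; the signal strength, fold count and classifier settings below are illustrative assumptions, not the analysis pipeline that will actually be used.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for fMRI data: trials x voxels, two stimulus conditions
# ('near' vs 'far'), with a weak condition-dependent bias in a subset of voxels.
rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 200
y = np.repeat([0, 1], n_trials // 2)          # 0 = near, 1 = far
X = rng.normal(size=(n_trials, n_voxels))
X[y == 1, :20] += 0.3                         # weak multivariate signal

# A linear SVM decodes the condition from the voxel pattern; cross-validation
# estimates how reliably the pattern discriminates near from far.
clf = SVC(kernel="linear", C=1.0)
cv = StratifiedKFold(n_splits=6, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print("decoding accuracy: %.2f" % scores.mean())
```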

Systems-level Analysis and Modelling for Cancer and Inflammation

Workshop, Jon Rowe (Computer Science) on behalf of Systems Science for Health

We propose two workshops: one based around Inflammation and the other around Cancer. These would be one-day events taking place in February and March 2011. The primary aim of the workshops is simply to assist in the integration of new members of staff with the existing SSfH team. A second important aim is to begin the collaborative process by trying to identify problems associated with the particular diseases (Inflammation and Cancer) for which computational and mathematical modelling would be valuable. We expect, therefore, that the workshops will initially be driven by people from the Medical School explaining the problems they are interested in solving. The Bioscientists would contribute in terms of describing the kinds of data that can be collected concerning these systems and how the analysis of such data can contribute to our understanding of the underlying biological systems. The Computer Scientists and Mathematicians would then be able to propose modelling and analysis techniques to exploit the data and develop predictive models.

Modeling and Proving Properties of Complex Systems

Fellowship, Manfred Kerber (Computer Science), with Colin Rowat (Economics)

An important tool for modelling complex economic systems is game theory, as introduced by von Neumann and Morgenstern. The games come in different forms. Abstract games in this tradition are defined by an irreflexive dominance operator. Economists typically endow this operator with additional properties to obtain an economically meaningful model, and then study its properties. Of particular interest are solution concepts such as the so-called ‘stable set’. This process is very expensive and occasionally also error-prone. It is expensive since the properties have to be found and proved by hand. While certain properties can be reused from related games, the checking has to be performed by hand and many properties require an adjustment of the arguments. It is occasionally error-prone: for example, it took almost a quarter of a century to provide a counterexample to von Neumann and Morgenstern's assumption that stable sets always exist. These problems occur because working with a new axiom system means that no one has a good intuition for it yet (there are more recent examples of this type).
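
For a finite abstract game given as an explicit dominance relation, internal and external stability can be checked mechanically; the sketch below does so for a toy three-outcome cycle (which, fittingly, has no stable set). It illustrates the definition only, not the proposed theorem-prover support.

```python
def is_stable(candidate, outcomes, dominates):
    """von Neumann-Morgenstern stability for a finite abstract game:
    internal - no outcome in the set dominates another outcome in the set;
    external - every outcome outside the set is dominated by one inside."""
    internal = not any(dominates(x, y) for x in candidate for y in candidate)
    external = all(any(dominates(x, y) for x in candidate)
                   for y in outcomes - candidate)
    return internal and external

# Tiny illustrative game: three outcomes with a cyclic dominance relation.
outcomes = {"a", "b", "c"}
edges = {("a", "b"), ("b", "c"), ("c", "a")}   # a dominates b, etc.
dom = lambda x, y: (x, y) in edges

print(is_stable({"a"}, outcomes, dom))         # False: 'c' is not dominated by 'a'
print([s for s in [{"a"}, {"b"}, {"a", "c"}] if is_stable(s, outcomes, dom)])  # [] - the cycle has no stable set
```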

The scientific agenda is to support the formalization of complex systems as games using systems developed in the MKM (mathematical knowledge management) community, and to support the proving of theorems in the field with a tactical theorem prover that allows tactics to be reused in new situations. In order to do this we will have to identify and adapt a suitable system. Coq, Isabelle, Theorema, and Omega are some candidates, with Isabelle currently the front runner. A small case study will be conducted to gather insights into which system is best suited to support our intended application area. We will seek advice from Prof Kohlhase of Jacobs University on how best to represent the knowledge. The applicant will visit Prof Kohlhase in November to discuss details. Furthermore, a visit to the chief developer of the Theorema project in Linz, Dr Wolfgang Windsteiger, is planned.

Mining the Social Unconscious (BTG away day 2 project)

Feasibility Study, Keith Challis (Vista Centre, Institute of Archaeology & Antiquity) and Mourad Oussalah (Electronic, Electrical & Computer Engineering), with Thorsten Schnier (Computer Science)

On-line social networks provide a means for rapid communication between widely dispersed networks of people. The information generated by users of these networks provides an unfiltered snapshot of social mores and opinions at any point in time and is a hugely significant resource for those interested in human behaviour. Among social networking media, the micro-blogging tool Twitter is particularly noteworthy for functioning in near real-time and providing information on the spatial location of individuals at the point of posting. It is thus potentially possible to reconstruct both social and spatial patterns of behaviour from an archive of users' postings. This research aims to develop and test a methodology (comprising coding tools and analytical techniques) to examine the unconsciously expressed preferences and opinions of individuals communicating using Twitter (www.twitter.com). The work will have a number of components:

  1. Development of tools to harvest geo-located tweets, based on harvesting the live tweet feed via the Twitter API using a suitable toolkit.
  2. Harvesting a significant volume of geolocated tweets over an extended period (approximately four weeks' continuous monitoring).
  3. Development of tools and methods for analysing the spatial and textual content of the harvested tweets to extract relevant information (see the sketch after this list).
  4. Drafting of a feasibility report and further grant applications.
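
A minimal sketch of component 3 is shown below, assuming the harvested archive is stored as newline-delimited JSON with Twitter-style `text` and `coordinates` fields; the file name, field names and keyword are assumptions for illustration only.

```python
import json
from collections import Counter

def scan_archive(path, keyword):
    """Scan a newline-delimited JSON archive of harvested tweets, keeping the
    coordinates of those whose text mentions the keyword and counting the
    most frequent words overall (a very crude textual analysis)."""
    points, words = [], Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            tweet = json.loads(line)
            text = tweet.get("text", "")
            words.update(w.lower() for w in text.split())
            geo = tweet.get("coordinates") or {}
            if keyword.lower() in text.lower() and geo.get("coordinates"):
                lon, lat = geo["coordinates"]      # GeoJSON order: lon, lat
                points.append((lat, lon))
    return points, words

# Hypothetical archive file produced by the harvesting step (components 1-2).
points, words = scan_archive("tweets.jsonl", keyword="coffee")
print(len(points), "geolocated mentions;", words.most_common(5))
```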

Toward Semantic Automatic Text Summarization (BTG away day 2 project)

Feasibility Study, Mourad Oussalah (Electronic, Electrical & Computer Engineering) and Oliver Mason (English)

This project aims to design, implement and test a new method for improved automated text summarization, using an algorithm developed at the School of English to provide an assessment of the naturalness of the resulting output. The project will build on existing work at the School of Electronics, Electrical and Computer Engineering on Automatic Text Summarization and enhance it with linguistic processing and analysis of grammatical structure.

Automatic Text Summarization, although a hot research topic with a promising future, is recognized as a hard problem in natural language processing because of the difficulty of capturing the context and semantic aspects of sentences. This requires semantic analysis, discourse processing and inferential interpretation, among others. This project builds on Automatic Summarization work carried out at Electronics, Electrical and Computer Engineering that uses WordNet and a taxonomy derived from Wikipedia as a basis to compute the semantic similarity between sentences, and then additional metrics involving redundancy, diversity and the user's profile, among others, to capture the most relevant sentences to be displayed in the summary. Nevertheless, the information contained in the grammatical structure is not exploited in the current work. To account for this, we aim in this project to extend the current system by taking into account the cohesive structure of the text as derived through a linguistic analysis, and by employing a technique developed at the English department of the University of Birmingham to assess the naturalness of sentences. Throughout this project we anticipate:

  • A better handling of the semantic aspect of the sentences, especially, when semantic similarity between sentences is concerned.
  • A significant improvement of the Automatic Text Summarizer that approaches an abstractive-like summarization method rather than the commonly used extractive summarization.
  • A sound metric to quantify the naturalness of sentence/phrase/document, which will be very useful for educational purposes for instance.
  • A better interaction with digital and librarian communities for further enhancement and personalized development.
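
For reference, a deliberately crude extractive baseline of the kind the project aims to improve on (using plain word overlap rather than the WordNet/Wikipedia semantic similarity or the naturalness metric described above) can be sketched as follows.

```python
import re
from collections import Counter

def summarize(text, n_sentences=2):
    """Crude extractive baseline: score each sentence by its total word
    overlap with the other sentences and keep the top-scoring ones,
    returned in their original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    bags = [Counter(re.findall(r"\w+", s.lower())) for s in sentences]

    def overlap(a, b):
        shared = sum((a & b).values())
        total = sum((a | b).values())
        return shared / total if total else 0.0

    scores = [sum(overlap(bags[i], bags[j]) for j in range(len(bags)) if j != i)
              for i in range(len(bags))]
    keep = sorted(sorted(range(len(sentences)), key=lambda i: -scores[i])[:n_sentences])
    return " ".join(sentences[i] for i in keep)

print(summarize("Text summarization is hard. Semantic similarity helps rank "
                "sentences. Ranking sentences by similarity gives a summary. "
                "Grammatical structure is often ignored."))
```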

Agent-based modelling of Theory of Mind (ToM)

Feasibility Study, Dietmar Heinke (Psychology), with Nick Hawes and Aaron Sloman (Computer Science)

Theory of mind (ToM) is the ability of an agent (human or animal) to attribute mental states, e.g. beliefs, intents, etc., to itself and to other agents and, crucially, to understand that other agents have mental states that differ from its own. It is commonly assumed that ToM is a crucial mechanism for our everyday social behaviour, as social interactions are assumed to require us to reason about one another's mental states in a conceptually sophisticated way. However, a rapidly emerging literature (driven, in part, by research by Apperly and colleagues) suggests that such reasoning is rather slow, demanding and error-prone, which are exactly the wrong characteristics for guiding on-line social interaction (Apperly, Samson & Humphreys, 2009). This has led to the recent theoretical proposal that humans have “two systems” for interacting with others: one which requires effortful reasoning, and another that makes use of simpler heuristics, e.g. visual perspective taking (Apperly & Butterfill, 2009; see Samson, Apperly et al. 2010 for experimental evidence). The proposed research will implement such simple heuristics in an agent-based model (ABM), in order to test whether they are sufficient to generate simple social behaviours.
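
One such simple heuristic, level-1 visual perspective taking (which objects the other agent can currently see), can be sketched as follows; the field-of-view rule, coordinates and object names are illustrative assumptions rather than the model to be built.

```python
import math

def visible_to(agent_pos, facing_deg, objects, fov_deg=120):
    """Level-1 visual perspective taking: an object counts as 'seen' by the
    other agent if it falls within that agent's field of view around its
    facing direction (no occlusion handling in this sketch)."""
    seen = []
    for name, (ox, oy) in objects.items():
        angle = math.degrees(math.atan2(oy - agent_pos[1], ox - agent_pos[0]))
        diff = (angle - facing_deg + 180) % 360 - 180   # signed angular difference
        if abs(diff) <= fov_deg / 2:
            seen.append(name)
    return seen

# The modelled agent attributes to 'other' only what falls within other's
# view, rather than explicitly reasoning about other's beliefs.
objects = {"ball": (2, 0), "cup": (-3, 1), "book": (0, 4)}
print(visible_to(agent_pos=(0, 0), facing_deg=0, objects=objects))   # ['ball']
```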

Results from other rounds: