Recent advances in smart devices have made seamless monitoring of an individual's gait in ecological settings possible. Several envisioned applications bring clinicians and technologists together to monitor health through gait at a large ecological scale. Although technology has proven viable for pervasive healthcare, establishing a common notion of 'trust' between technologists and therapists remains very challenging, due to factors such as how closely the technology adheres to ground truth. To mitigate these fundamental challenges, this dissertation presents two works that were devised jointly by technologists and therapists from their inception: 1) a wearable-cloud (WC) framework, and 2) a smartphone-cloud (SC) framework. The goal of this dissertation is to portray the methodologies required to establish that common 'trust' between technologists and therapists. The WC framework allows stroke-affected individuals and therapists to connect with one another remotely. The SC framework uses vision-based advances to help individuals self-monitor their gait with a smartphone camera, with special emphasis on monitoring Parkinsonian gait.
Parkinsonian gait is associated with life-threatening consequences, such as fall risk, in patients with Parkinson's disease. Conventional Parkinsonian gait analysis relies heavily on expensive sensors and human labor. In this work, we propose a sensor-free, end-to-end system that enables automated and accurate Parkinsonian gait detection and analysis from videos recorded by pervasive cameras. Specifically, we leverage deep learning technologies to extract the human skeleton from each video frame and to address the challenge of random camera angles. From the extracted gait features, we train a classifier based on a binary decision tree. On 16 Parkinsonian gait and 13 healthy gait videos, our system detects Parkinsonian gait with 93.75% accuracy and healthy gait with 100% accuracy.
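As a simplified illustration of the final classification stage, the sketch below trains a binary decision tree on skeleton-derived gait features. The feature names, values, and labels are hypothetical placeholders for illustration only, not the paper's actual features or data.

```python
# Hypothetical sketch: classifying gait as Parkinsonian vs. healthy from
# hand-picked, skeleton-derived features. Feature values are illustrative.
from sklearn.tree import DecisionTreeClassifier

# Each row: [normalized step length, cadence (steps/min), arm swing amplitude]
X_train = [
    [0.35, 110, 0.10],  # Parkinsonian-like: short steps, reduced arm swing
    [0.30, 115, 0.08],
    [0.62, 105, 0.45],  # healthy-like: longer steps, full arm swing
    [0.58, 100, 0.50],
]
y_train = [1, 1, 0, 0]  # 1 = Parkinsonian gait, 0 = healthy gait

clf = DecisionTreeClassifier(max_depth=2, random_state=0)
clf.fit(X_train, y_train)

# A short-stepped, low-arm-swing sample is classified as Parkinsonian:
print(clf.predict([[0.33, 112, 0.09]]))  # [1]
```

A shallow decision tree keeps the classifier interpretable: each split corresponds to a threshold on a clinically meaningful gait feature.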
As a driving force of next-generation manufacturing, 3D printing technology is having a transformative effect on various industrial domains and has been widely applied across a broad spectrum of applications. It is also spreading to other versatile settings through portable, battery-powered 3D printers that operate on a limited energy budget. Since reducing manufacturing energy is an essential challenge for industrial sustainability and national economies, this growing trend motivates us to explore the energy consumption of 3D printers for the purpose of energy efficiency. To this end, we perform an in-depth analysis of energy consumption in commercial, off-the-shelf 3D printers from an instruction-level perspective. We build an instruction-level energy model and an energy profiler to analyze the energy cost of the fabrication process. From the insights obtained through the energy profiler, we propose and implement a cross-layer energy optimization solution, called 3DGates, which spans the instruction set, the compiler, and the firmware. We evaluate 3DGates over 338 benchmarks on a 3D printer and achieve an overall energy reduction of 25%.
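In spirit, an instruction-level energy model attributes a cost to each instruction and sums it over the program. The minimal sketch below does this for a few common G-code opcodes; the per-instruction coefficients are hypothetical placeholders, not measurements from the paper.

```python
# Minimal sketch of an instruction-level energy model for G-code.
# The energy coefficients below are hypothetical, for illustration only.
HYPOTHETICAL_ENERGY_J = {
    "G0": 1.2,    # rapid (non-printing) move
    "G1": 2.5,    # printing move (motion + extrusion)
    "G28": 8.0,   # homing
    "M104": 0.5,  # set hotend temperature (command overhead only)
}

def estimate_energy(gcode_lines):
    """Sum a fixed energy cost per recognized instruction opcode."""
    total = 0.0
    for line in gcode_lines:
        opcode = line.split(None, 1)[0] if line.strip() else ""
        total += HYPOTHETICAL_ENERGY_J.get(opcode, 0.0)
    return total

program = ["G28", "M104 S200", "G0 X10 Y10", "G1 X20 Y20 E1.5"]
print(estimate_energy(program))  # 8.0 + 0.5 + 1.2 + 2.5 = 12.2
```

A real model would also have to account for instruction operands (distance, speed, temperature), which is where a profiler that measures actual hardware comes in.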
Long-term rehabilitation opportunities are critical for millions of individuals with chronic upper limb motor deficits striving to improve their motor performance through self-managed rehabilitation programs. However, professional support for rehabilitation across the lifespan is minimal. In this paper, we introduce an upper extremity rehabilitation system, the Quality of Movement Feedback-Oriented Measurement System (QM-FOrMS), which integrates cost-effective portable sensors and clinically verified motion quality analysis for individuals with upper limb motor deficits. Specifically, QM-FOrMS comprises an eTextile pressure-sensitive mat, named Smart Mat, a sensory can, named Smart Can, and a mobile device. We develop a personalizable and adaptive upper limb rehabilitation program, including both unilateral and bilateral functional activities, which can be selected from a list or custom designed to further tailor the program to the individual. QM-FOrMS derives quantitative evaluation of motor performance from fine-grained kinematic measurements. We ran a pilot study with three groups: five baseline subjects (i.e., healthy young adults), six older adults, and four individuals with movement impairment. The experimental results show that QM-FOrMS can provide detailed features during unattended rehabilitation exercises, and that the proposed metrics can distinguish evaluation results across groups.
3D printing is an emerging technique in product manufacturing, and its applications have been expanding rapidly in home-based production. Compared to traditional manufacturing techniques, such as Computerized Numeric Control (CNC) machine tools, 3D printing is believed to be more cost-effective for fabricating personalized products. Product cost estimation in 3D printing mainly takes material expenditure into account, and extensive studies have focused on reducing filament expense or developing recyclable filaments. However, electricity expenditure is another inevitable cost of the 3D printing process, yet it is omitted from cost estimation. To this end, this paper introduces the first in-depth study of energy consumption in 3D printing. Specifically, our study comprises two parts. The first part quantifies both material and electricity use in 3D printing and finds that electricity accounts for up to 32% of the total cost. The second part characterizes the energy consumption and identifies its sensitivity to various parameters. We also share insights and potential solutions for optimizing the power consumption of 3D printers.
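The cost-estimation point above can be made concrete with a toy per-print breakdown that includes both material and electricity. All numbers below are illustrative assumptions, not the paper's measurements.

```python
# Hedged sketch: a per-print cost estimate that includes electricity,
# the factor the study finds conventional estimates omit.
# All prices and quantities are illustrative assumptions.
def print_cost(filament_g, filament_price_per_kg, energy_kwh, tariff_per_kwh):
    """Return (material cost, electricity cost) for one print."""
    material = filament_g / 1000.0 * filament_price_per_kg
    electricity = energy_kwh * tariff_per_kwh
    return material, electricity

# Hypothetical print: 50 g of filament at $25/kg, 0.8 kWh at $0.15/kWh.
material, electricity = print_cost(50, 25.0, 0.8, 0.15)
total = material + electricity
print(round(electricity / total * 100, 1))  # electricity's share of cost, in %
```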
One of the reasons programming mobile systems is so hard is the wide variety of environments a typical app encounters at runtime. As a result, in many cases only post-deployment user testing can determine the right algorithm to use, the rate at which something should happen, or when an app should attempt to conserve energy. Programmers should not be forced to make these choices at development time. Unfortunately, languages leave no way for programmers to express and structure uncertainty about runtime conditions, forcing them to adopt ineffective or fragile ad hoc solutions. We introduce a new approach based on structured uncertainty through a new language construct: the maybe statement. maybe statements allow programmers to defer choices about app behavior that cannot be made at development time, while providing enough structure for a system to later adaptively choose from multiple alternatives. The uncertainty introduced by maybe statements can be eliminated in many ways: through simulation, split-testing, user configuration, temporal adaptation, or machine learning techniques, depending on the type of adaptation appropriate for each situation. Our paper motivates the maybe statement, presents its syntax, and describes a complete system for testing and choosing from maybe alternatives.
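The paper defines its own syntax for maybe; as an illustrative analogue only (not the paper's actual construct), the idea of deferring a choice among alternatives to a swappable resolution strategy can be mimicked in plain Python:

```python
# Illustrative analogue only: `maybe` in the paper is a language construct;
# this sketch merely mimics deferring a choice among alternatives to a
# pluggable resolution strategy (split-testing, user configuration,
# machine learning, ...). All names here are hypothetical.
import random

def maybe(alternatives, strategy=random.choice):
    """Defer the choice among alternatives to a swappable strategy."""
    return strategy(alternatives)

# App code expresses uncertainty without committing to one behavior:
refresh_seconds = maybe([15, 30, 60])

# Later, the deployment system can swap in a different strategy without
# touching app code, e.g. always the largest (most battery-friendly)
# refresh interval:
battery_saver = maybe([15, 30, 60], strategy=max)
print(battery_saver)  # 60
```

The key property this mimics is separation of concerns: the app states the alternatives once, and the mechanism for eliminating the uncertainty can change after deployment.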
Embedded database engines such as SQLite provide a convenient data persistence layer and have spread, along with the applications using them, to many types of systems, including interactive devices such as smartphones. Android, the most widely distributed smartphone platform, both uses SQLite internally and provides interfaces encouraging apps to use SQLite to store their own private structured data. As similar functionality appears in all major mobile operating systems, embedded database performance affects the response times and resource consumption of billions of smartphones and the millions of apps that run on them, making it more important than ever to characterize smartphone embedded database workloads. To do so, we present results from an experiment that recorded SQLite activity on 11 Android smartphones during one month of typical usage. Our analysis shows that Android SQLite usage produces queries and access patterns quite different from canonical server workloads. We argue that evaluating smartphone embedded databases will require a new benchmarking suite, and we use our results to outline some of its characteristics.
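To illustrate what per-statement recording of SQLite activity looks like in miniature, the sketch below uses Python's standard `sqlite3` trace hook as a stand-in; the paper's on-device Android collection tooling is, of course, different.

```python
# Stand-in sketch: log every SQL statement a SQLite connection executes,
# using Python's sqlite3 trace hook (not the paper's Android tooling).
import sqlite3

log = []
conn = sqlite3.connect(":memory:")
conn.set_trace_callback(log.append)  # called with each executed SQL string

conn.execute("CREATE TABLE msgs (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO msgs (body) VALUES (?)", ("hello",))
rows = conn.execute("SELECT body FROM msgs").fetchall()

# The log now holds the statement stream, ready for workload analysis
# (statement mix, table access patterns, transaction boundaries, ...).
for stmt in log:
    print(stmt)
```

Workload characterization then reduces to offline analysis of such statement streams, e.g. counting reads vs. writes or measuring inter-statement timing.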
Allele sharing between modern and archaic hominin genomes has been variously interpreted to have originated from ancestral genetic structure or through non-African introgression from archaic hominins. However, evolution of polymorphic human deletions that are shared with archaic hominin genomes has yet to be studied. We identified 427 polymorphic human deletions that are shared with archaic hominin genomes, approximately 87% of which originated before the Human–Neandertal divergence (ancient) and only approximately 9% of which have been introgressed from Neandertals (introgressed). Recurrence, incomplete lineage sorting between human and chimp lineages, and hominid-specific insertions constitute the remaining approximately 4% of allele sharing between humans and archaic hominins. We observed that ancient deletions correspond to more than 13% of all common (>5% allele frequency) deletion variation among modern humans. Our analyses indicate that the genomic landscapes of both ancient and introgressed deletion variants were primarily shaped by purifying selection, eliminating large and exonic variants. We found 17 exonic deletions that are shared with archaic hominin genomes, including those leading to three fusion transcripts. The affected genes are involved in metabolism of external and internal compounds, growth and sperm formation, as well as susceptibility to psoriasis and Crohn’s disease. Our analyses suggest that these “exonic” deletion variants have evolved through different adaptive forces, including balancing and population-specific positive selection. Our findings reveal that genomic structural variants that are shared between humans and archaic hominin genomes are common among modern humans and can influence biomedically and evolutionarily important phenotypes.
We model the objective function under the assumption that jobs entering the scheduler follow a Poisson distribution and that jobs leaving the multilevel feedback queue (MLFQ) scheduler are also Poisson distributed. We further assume that the number of CPUs in a processing element (PE) is not restricted to one; many CPUs may be integrated into one PE. Therefore, we adopt the M/M/c queue model for our calculations. In Kendall's notation, M/M/c describes a system where arrivals form a single queue governed by a Poisson process, there are c servers, and job service times are exponentially distributed. Gridlets provided by users are assigned to PEs, and gridlets are shifted between queues of the MLFQ scheduler, according to their remaining service time, until completion. In MLFQ, the overall architecture is divided into multiple prioritized queues. This approach allows gridlets that have starved in a lower-priority queue for a long time to obtain resources. As a result, the response time of the starved gridlets decreases, and the overall turnaround time of the scheduling process decreases. We simulate this scheduling policy using the Alea GridSim toolkit to test its performance. The proposed MLFQ scheduling algorithm performs better than the FCFS and PBS_PRO algorithms in most scenarios.
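The M/M/c assumption makes standard queueing quantities available in closed form. A sketch using the textbook Erlang C formula, where `lam` is the arrival rate, `mu` the per-server service rate, and `c` the number of servers (the example rates are illustrative):

```python
# Standard M/M/c quantities: the Erlang C probability that an arriving
# job must wait, and the resulting mean waiting time in queue.
from math import factorial

def erlang_c(lam, mu, c):
    """P(wait) for an M/M/c queue (requires lam < c*mu for stability)."""
    a = lam / mu                      # offered load
    rho = a / c                       # per-server utilization, must be < 1
    num = a**c / (factorial(c) * (1 - rho))
    den = sum(a**k / factorial(k) for k in range(c)) + num
    return num / den

def mean_wait(lam, mu, c):
    """Expected time in queue: W_q = C(c, a) / (c*mu - lam)."""
    return erlang_c(lam, mu, c) / (c * mu - lam)

# Illustrative rates: 2 gridlets/s arriving, each of 2 CPUs serving 1.5/s.
print(round(erlang_c(2.0, 1.5, 2), 3))  # 0.533
```

Such closed-form baselines are useful for sanity-checking the simulated MLFQ results against single-queue M/M/c behavior.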