PROJECT TITLE:
Sequential and decentralized estimation of linear-regression parameters in wireless sensor networks
Sequential estimation of a vector of linear-regression coefficients is considered under both centralized and decentralized setups. In sequential estimation, the number of observations used for estimation is determined by the observed samples and is therefore random, as opposed to fixed-sample-size estimation. Specifically, upon receiving a new sample, if a target accuracy level has been reached, we stop and estimate using the samples collected so far; otherwise, we continue and receive another sample. It is known that finding an optimum sequential estimator, which minimizes the average observation number for a given target accuracy level, is an intractable problem under a general stopping rule that depends on the complete observation history. By properly restricting the search space to stopping rules that depend on a specific subset of the complete observation history, we derive the optimum sequential estimator in the centralized case via optimal stopping theory. However, finding the optimum stopping rule in this case requires numerical computations that scale quadratically with the number of parameters to be estimated. For the decentralized setup with stringent energy constraints, under an alternative problem formulation that is conditional on the observed regressors, we first derive a simple optimum scheme with a well-defined one-dimensional stopping rule regardless of the number of parameters. Then, following this simple optimum scheme, we propose a decentralized sequential estimator whose computational complexity and energy consumption scale linearly with the number of parameters. Specifically, in the proposed decentralized scheme, a close-to-optimum average stopping-time performance is achieved by occasionally transmitting a single pulse of very short duration.
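To illustrate the basic idea of a sequential least-squares estimator with a data-dependent stopping rule, the sketch below stops once a scalar accuracy proxy, the trace of (H^T H)^{-1} (proportional to the estimator's error covariance under the linear model), falls below a target threshold. This is a minimal illustrative sketch only, not the optimum stopping rule derived in the project; the function name, the Gaussian regressor model, and the trace-based rule are assumptions made for demonstration.

```python
import numpy as np

def sequential_ls(threshold, beta, noise_std=0.1, seed=None, max_n=10_000):
    """Illustrative sequential least-squares estimation of regression
    coefficients y_n = h_n^T beta + w_n.

    Samples arrive one at a time; we stop as soon as the accuracy proxy
    trace((H^T H)^{-1}) drops below `threshold`, so the number of
    observations used is random. Returns the estimate and that number.
    """
    rng = np.random.default_rng(seed)
    p = len(beta)
    H_rows, y_vals = [], []
    for n in range(1, max_n + 1):
        h = rng.standard_normal(p)                      # new regressor row
        H_rows.append(h)
        y_vals.append(h @ beta + noise_std * rng.standard_normal())
        if n >= p:                                      # H^T H must be invertible
            G = np.array(H_rows).T @ np.array(H_rows)
            if np.trace(np.linalg.inv(G)) < threshold:  # accuracy target met: stop
                break
    H = np.array(H_rows)
    y = np.array(y_vals)
    beta_hat, *_ = np.linalg.lstsq(H, y, rcond=None)    # LS estimate at stopping time
    return beta_hat, n
```

For i.i.d. standard-normal regressors, trace((H^T H)^{-1}) shrinks roughly like p/n, so the stopping time grows with both the number of parameters p and the tightness of the accuracy target, mirroring the accuracy/sample-size trade-off described above.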