Lagrangian Relaxation Neural Network for Unit Commitment
| Content Provider | Semantic Scholar |
|---|---|
| Author | Luh, Peter B.; Wang, Yajun; Zhao, Xing |
| Copyright Year | 1999 |
| Abstract | This paper presents a novel method for unit commitment by synergistically combining Lagrangian relaxation for constraint handling with Hopfield-type recurrent neural networks for fast convergence to the minimum. The key idea is to set up a Hopfield-type network using the negative dual as its energy function. This network is connected to “neuron-based dynamic programming modules” that make full use of the DP structure to solve individual unit subproblems. The overall network is proved to be stable, and the difficulties in handling integer variables, subproblem constraints, and subproblem local minima plaguing current neural network methods are avoided. Unit commitment solutions are thus natural results of network convergence. Software simulation using data sets from Northeast Utilities demonstrates that the results are much better than what has been reported in the neural network literature, and that the method can provide near-optimal solutions for practical problems. Furthermore, the method has the potential to be implemented in hardware with much improved quality and speed. |
| File Format | PDF; HTML |
| Alternate Webpage(s) | http://www.engr.uconn.edu/msl/paper/xing/PES99WIN.pdf |
| Language | English |
| Access Restriction | Open |
| Subject Keyword | Artificial neural network; Biological Neural Networks; Computer simulation; Convergence (action); Dual; Dynamic programming; Handling (Psychology); Hopfield network; Integer (number); Lagrangian relaxation; Linear programming relaxation; Mathematical optimization; Maxima and minima; Network convergence; Neural Network Simulation; Neuron; Preparation; Recurrent neural network; Solutions; Synergy |
| Content Type | Text |
| Resource Type | Article |
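The abstract's framing can be illustrated with a minimal sketch: the system demand constraint is relaxed with Lagrangian multipliers, each unit's on/off subproblem is solved by dynamic programming, and the multipliers are updated in an outer loop. This toy uses a plain subgradient update for the dual, whereas the paper replaces that outer loop with a Hopfield-type network whose energy function is the negative dual; all numbers (capacities, costs, demands) are invented for illustration only.

```python
# Toy Lagrangian relaxation for unit commitment (illustrative data only).
# Relax the demand constraint sum_i p_it = d_t with multipliers lam[t];
# each unit then solves an independent DP over on/off states with a
# startup cost, and the dual is improved by a subgradient step.

T = 3                             # scheduling horizon (hours)
demand = [100.0, 150.0, 120.0]    # system demand per hour (MW)

# Each unit: (capacity MW, marginal cost $/MWh, startup cost $)
units = [(120.0, 20.0, 50.0), (100.0, 15.0, 80.0)]

def unit_dp(cap, mc, sc, lam):
    """Solve one unit's subproblem min sum_t [(mc - lam[t]) * p_t] plus
    startup costs, by DP over states 0=off, 1=on (unit starts off)."""
    n = len(lam)
    # With a linear cost, the best dispatch when on is all-or-nothing.
    on_p = [cap if lam[t] > mc else 0.0 for t in range(n)]
    on_cost = [(mc - lam[t]) * on_p[t] for t in range(n)]
    INF = float("inf")
    cost = [[INF, INF] for _ in range(n)]
    prev = [[0, 0] for _ in range(n)]
    cost[0][0] = 0.0
    cost[0][1] = sc + on_cost[0]          # starting up in hour 0
    for t in range(1, n):
        # transition into "off": cheaper of staying off or shutting down
        if cost[t - 1][0] <= cost[t - 1][1]:
            cost[t][0], prev[t][0] = cost[t - 1][0], 0
        else:
            cost[t][0], prev[t][0] = cost[t - 1][1], 1
        # transition into "on": stay on, or pay the startup cost
        if cost[t - 1][1] <= cost[t - 1][0] + sc:
            cost[t][1], prev[t][1] = cost[t - 1][1] + on_cost[t], 1
        else:
            cost[t][1], prev[t][1] = cost[t - 1][0] + sc + on_cost[t], 0
    # backtrack the optimal on/off schedule, then read off the dispatch
    s = 0 if cost[n - 1][0] <= cost[n - 1][1] else 1
    sched = [0] * n
    for t in range(n - 1, -1, -1):
        sched[t] = s
        s = prev[t][s]
    return [on_p[t] if sched[t] else 0.0 for t in range(n)]

# Outer loop: subgradient ascent on the dual (the paper's Hopfield-type
# network plays this role instead).
lam = [0.0] * T
total = [0.0] * T
for it in range(200):
    total = [0.0] * T
    for cap, mc, sc in units:
        p = unit_dp(cap, mc, sc, lam)
        total = [a + b for a, b in zip(total, p)]
    step = 1.0 / (it + 1)                 # diminishing step size
    lam = [max(0.0, lam[t] + step * (demand[t] - total[t]))
           for t in range(T)]

print("multipliers:", [round(l, 2) for l in lam])
print("generation: ", total)
```

Because the subproblem solutions are integer (on/off), the relaxed dispatch generally does not match demand exactly at any single iteration; in practice, as the paper notes for Lagrangian methods generally, a feasibility-restoration step converts the converged dual solution into a near-optimal commitment.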