These subjects were studied through a literature review covering existing and upcoming valuation practices in real estate, the steps needed to perform machine learning tasks, architectures that support big data processing, and concept drift. This resulted in a design made up of four components: an ETL and data processing component, a modelling component, a Kafka connector, and a client-facing API. An important part of ensuring the efficiency and scalability of the system is the handling of concept drift: models are only retrained when the distribution of the target training value has changed significantly.
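As an illustration of that retraining trigger, the sketch below flags drift with a two-sample Kolmogorov-Smirnov test on the target values. The function name should_retrain, the choice of test, the significance level, and the toy price figures are all assumptions for illustration; they are not the detector the thesis actually implements.

```python
import numpy as np
from scipy.stats import ks_2samp

def should_retrain(reference_targets, incoming_targets, alpha=0.05):
    """Flag drift when the two samples of target values differ significantly."""
    _, p_value = ks_2samp(reference_targets, incoming_targets)
    return p_value < alpha

# Toy data: past sale prices vs. a shifted recent market (made-up numbers).
rng = np.random.default_rng(42)
baseline = rng.normal(300_000, 50_000, size=1_000)
recent = rng.normal(340_000, 60_000, size=1_000)

if should_retrain(baseline, recent):
    print("Target distribution drifted significantly; retrain the model.")
```

A two-sample test on the target values is one common way to operationalise "the distribution has changed significantly"; the real system may use a different statistic or comparison window.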
These components use storage in the form of a Postgres database, disk storage, and Elasticsearch logs. The logs (on model performance and concept drift usage) can be interpreted through a Grafana dashboard, which is editable through its own GUI.
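For illustration, a component could push a performance log entry to Elasticsearch for the Grafana dashboard to read, along these lines; the endpoint, index name, and field names are hypothetical, and the official elasticsearch Python client (v8-style API) stands in for whatever logging path the system actually uses.

```python
from datetime import datetime, timezone
from elasticsearch import Elasticsearch  # official Python client, v8-style API

es = Elasticsearch("http://localhost:9200")  # assumed cluster endpoint

# Hypothetical log document; index and field names are illustrative only.
log_entry = {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "model": "avm-v1",          # made-up model identifier
    "mae": 21500.0,             # example performance metric
    "drift_detected": False,    # whether the drift check fired
}
es.index(index="model-performance-logs", document=log_entry)
```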
Finally, to test the success of the project, a testing plan was set up and the code was reviewed by an external group (SIG). The code achieved all the testing milestones and received a maintainability score of 4.5/5 in a mid-development review. With this project, the concept of automated valuation models inside GeoPhy's new architecture has been tested and proven, and the project is ready to be further developed and used in practice.
The CMS Hadron Calorimeter in the barrel, endcap, and forward regions is fully commissioned. Cosmic ray data were taken with and without magnetic field at the surface hall and, after installation in the experimental hall, a hundred meters underground. Various measurements were also performed during the few days of beam in the LHC in September 2008. Calibration parameters were extracted, and the energy response of the HCAL determined from test beam data has been checked.
Commissioning studies of the CMS hadron calorimeter have identified sporadic uncharacteristic noise and a small number of malfunctioning calorimeter channels. Algorithms have been developed to identify and address these problems in the data. The methods have been tested on cosmic ray muon data, calorimeter noise data, and single beam data collected with CMS in 2008. The noise rejection algorithms can be applied to LHC collision data at the trigger level or in the offline analysis. The application of the algorithms at the trigger level is shown to remove 90% of noise events with fake missing transverse energy above 100 GeV, which is sufficient for the CMS physics trigger operation.