FB Technik
Abstract: This paper is about detecting the difference between fully random and semi-random shuffled data sets with the use of unsupervised learning algorithms. Because of the limitations of the k-means algorithm alone, a recurrent autoencoder is used for feature extraction to improve the results of k-means. In the next step, the autoencoder alone is used for clustering.
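The two-stage pipeline described above (feature extraction, then k-means) can be sketched as follows. This is a minimal illustration only, not the paper's implementation: a hand-crafted permutation statistic stands in for the recurrent autoencoder's learned features, the "semi-random" class is modeled here as a permutation produced by only a few transpositions, and all names and parameters are assumptions.

```python
import numpy as np

def features(p):
    """Stand-in for learned autoencoder features: the fraction of
    adjacent ascents and the fraction of fixed points."""
    p = np.asarray(p)
    return np.array([np.mean(p[:-1] < p[1:]),
                     np.mean(p == np.arange(len(p)))])

def kmeans(X, init, iters=50):
    """Minimal Lloyd's k-means with explicit initial centers."""
    centers = np.array(init, dtype=float)
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

rng = np.random.default_rng(1)
n = 100
# class 0: fully random permutations
random_perms = [rng.permutation(n) for _ in range(100)]
# class 1: "semi-random" permutations -- only a few transpositions,
# so traces of the sorted order survive
semi_perms = []
for _ in range(100):
    p = np.arange(n)
    for _ in range(10):
        i, j = rng.integers(n, size=2)
        p[i], p[j] = p[j], p[i]
    semi_perms.append(p)

X = np.array([features(p) for p in random_perms + semi_perms])
# deterministic init: one seed point from each half of the data
labels = kmeans(X, init=[X[0], X[-1]])
```

Because the two classes are well separated in this toy feature space, the clustering recovers the class boundary almost exactly; with raw sequences instead of extracted features, k-means performs far worse, which is the paper's motivation for the autoencoder.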
Introduction: In recent years, machine learning has been applied in more and more areas, and it is also well suited for pattern recognition in data. Random data is characterized by the absence of defined patterns. Permutations without repetitions have the highest entropy among sequences of their length, which matches the view of randomness attributed to Andrei Kolmogorov: random data carries the highest amount of information and cannot be compressed. Therefore, this paper analyses the difference between fully random permutations and well-shuffled permutations that still retain some residual patterns. This is done via a recurrent autoencoder.
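A quick way to see such residual patterns is to compare a simple order statistic between the two kinds of permutations. The sketch below is illustrative and not from the paper: the imperfect shuffle is modeled as a small number of random transpositions, and it counts adjacent ascents, of which a uniformly random permutation of length n has (n-1)/2 in expectation, while a nearly sorted permutation has far more.

```python
import random

def full_shuffle(n, rng):
    """Uniformly random permutation (Fisher-Yates via random.shuffle)."""
    p = list(range(n))
    rng.shuffle(p)
    return p

def semi_shuffle(n, swaps, rng):
    """Imperfect shuffle: only `swaps` random transpositions,
    leaving much of the sorted order intact."""
    p = list(range(n))
    for _ in range(swaps):
        i, j = rng.randrange(n), rng.randrange(n)
        p[i], p[j] = p[j], p[i]
    return p

def ascents(p):
    """Number of adjacent positions with p[i] < p[i+1]."""
    return sum(p[i] < p[i + 1] for i in range(len(p) - 1))

rng = random.Random(0)
n, trials = 100, 200
mean_full = sum(ascents(full_shuffle(n, rng)) for _ in range(trials)) / trials
mean_semi = sum(ascents(semi_shuffle(n, 10, rng)) for _ in range(trials)) / trials
print(mean_full, mean_semi)  # mean_semi is clearly larger
```

A single hand-crafted statistic like this only captures one kind of pattern; the appeal of the recurrent autoencoder is that it learns whatever structure distinguishes the two classes directly from the sequences.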