[kand] Perustieteiden korkeakoulu / SCI
Permanent URI for this collection: https://aaltodoc.aalto.fi/handle/123456789/12
Recent Submissions
Now showing 1 - 20 of 4430
- Estimating CMS experiment’s trigger efficiency with gradient-boosting decision tree (2026-03-26) Tran, Kha. School of Science | Bachelor's thesis. The Compact Muon Solenoid (CMS) is one of the main experiments at the Large Hadron Collider (LHC), the world’s largest particle accelerator, operated by the European Organization for Nuclear Research (CERN). CERN plays a critical role in advancing the study of fundamental particles and their interactions. Inside the LHC, the particle beams are accelerated and made to collide up to 40 million times per second, producing far more data than could possibly be stored. This data challenge requires CMS to implement a trigger system: a filter that selects the most interesting collision events for further study while permanently discarding the majority of events. This thesis presents a novel approach to studying trigger efficiency within the CMS framework using a machine learning technique, the gradient-boosting decision tree. Building on previously developed methodology, this work aims to enhance the precision and generalisation of the method and thereby improve the predictive power of trigger performance estimates. Studying the accuracy of the trigger system is essential for effective data collection during experiments. We introduce an advanced machine learning model that analyses the 2018 datasets recorded by CMS, enabling a more efficient evaluation of trigger performance. Compared to previous work, this thesis integrates a larger dataset, data engineering, model architecture selection, and optimisation algorithms. The goal is to find the best model to handle the challenges of high-energy physics data and to generalise to future analyses.
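As a hedged illustration of the technique named in the abstract above, the sketch below trains a gradient-boosted decision tree to predict whether an event passes a trigger. The kinematic features, cuts, and data are all invented for this example; they are not the thesis's actual CMS features or dataset.

```python
# Hypothetical sketch: gradient-boosted trees predicting trigger pass/fail
# from made-up kinematic features (not the thesis's CMS pipeline).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
pt = rng.exponential(scale=30.0, size=n)    # transverse momentum (GeV), synthetic
eta = rng.uniform(-2.4, 2.4, size=n)        # pseudorapidity, synthetic
# Toy "trigger": fires for high-pt, central events, with 5 % label noise
passed = ((pt > 25) & (np.abs(eta) < 2.0)) ^ (rng.random(n) < 0.05)

X = np.column_stack([pt, eta])
X_tr, X_te, y_tr, y_te = train_test_split(X, passed, random_state=0)
model = GradientBoostingClassifier(n_estimators=200, max_depth=3)
model.fit(X_tr, y_tr)
# Per-event efficiency estimates come from the predicted pass probability
eff = model.predict_proba(X_te)[:, 1]
print(f"test accuracy: {model.score(X_te, y_te):.2f}")
```

In a real trigger-efficiency study the predicted probabilities, not the hard labels, are the quantity of interest, since they estimate the efficiency as a function of the event kinematics.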
- Unemployment and internal net migration in Finnish municipalities (2026-03-25) Pekonen, Sameli. School of Science | Bachelor's thesis. This bachelor's thesis examines the correlation between the unemployment rate and internal net migration in Finnish municipalities in 1990–2024. Based on these correlations, the thesis aims to assess whether reducing unemployment could be an effective way for a municipality to improve its net migration figures. The relationship is examined both between and within municipalities, but no causal model is attempted. The data are drawn from Statistics Finland's municipal statistics. The research method is linear regression analysis, applied both to comparisons between municipalities and to within-municipality time series. In the analysis, municipalities are divided into three size classes by population, and for medium-sized municipalities the effect of commuter-belt municipalities is additionally taken into account. The results show that the correlation between unemployment and net migration is almost always negative, both between and within municipalities, so higher unemployment is associated with weaker net migration. The statistical explanatory power of the relationship is, however, generally weak. In the between-municipality analysis the relationship is most significant for medium-sized municipalities, but this significance diminishes considerably once the effect of larger cities' commuter-belt municipalities is taken into account; being a commuter-belt municipality turns out to explain migration more strongly than unemployment does. The thesis concludes that lowering unemployment does not appear to be a particularly effective way to raise a municipality's internal net migration figures. Although unemployment and net migration are negatively correlated, the explanatory power of the fitted models is generally low, and a causal relationship appears unlikely, since accounting for even a single external variable (commuter-belt municipalities) considerably reduces the significance of unemployment's effect.
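A minimal sketch of the regression setup described in the abstract above, fitting an OLS line of net migration on the unemployment rate. The numbers are synthetic and chosen only to mimic a weak negative relationship with a low R²; the thesis itself uses Statistics Finland's municipal statistics, which are not reproduced here.

```python
# Illustrative only: OLS fit of net migration vs. unemployment rate
# on synthetic "municipalities" (not the thesis's actual data).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
unemployment = rng.uniform(4, 18, size=100)          # %, synthetic
# Weak negative relationship plus large noise, mimicking a low R^2
net_migration = -8.0 * unemployment + rng.normal(0, 80, size=100)

X = unemployment.reshape(-1, 1)
model = LinearRegression().fit(X, net_migration)
r2 = model.score(X, net_migration)                   # coefficient of determination
print(f"slope: {model.coef_[0]:.1f}, R^2: {r2:.2f}")
```

A negative slope with a small R², as here, matches the pattern the thesis reports: a consistent but weakly explanatory association.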
- Enhancing time efficiency of self-driven rheological analysis using hyperspectral imaging and machine learning (2026-03-10) Ryynänen, Niklas Sebastian. School of Science | Bachelor's thesis. Cellulose nanofiber (CNF) hydrogels are promising bio-based materials with highly tunable mechanical properties, strongly governed by their microstructure. They are being explored for varied applications across the aerospace, automotive, environmental, and medical industries. Conventional rheological characterization with a rheometer, while accurate, is time-consuming and contact-based. Long measurement times combined with sample contact increase the risk of contamination and dehydration of the samples, which is undesirable for self-driven laboratory workflows. Hyperspectral imaging offers a non-invasive alternative for capturing spectral signatures influenced by the same underlying microstructure that also shapes the rheology of these CNF hydrogels. This thesis studies whether machine learning can learn the mapping from spectral features to rheological descriptors to accelerate materials characterization. Five 1 wt% CNF hydrogels with different fiber length distributions were measured using a transmission hyperspectral imaging setup. Ground-truth rheology was obtained with a rheometer equipped with a serrated Couette geometry. Viscosity flow curves, amplitude sweeps, and frequency sweeps were measured. The frequency response was parameterized by fitting the springpot model to obtain the parameters α and V. PLSR, Random Forest, and XGBoost models were trained on spectral features to predict the parameters α, V, the critical shear stress τ_crit, and the critical storage modulus G′. The critical shear strain γ_crit was then computed from τ_crit and G′. The critical values define the linear viscoelastic region for each sample. PLSR consistently outperformed the tree-based models, explaining on average 79 % of the variance across targets, with a mild underestimation bias.
The hyperspectral image acquisition required approximately 1 minute per sample, whereas the complete rheological measurement protocol took 1 hour per sample. The results indicate that combining hyperspectral imaging with machine learning can deliver substantial time savings while maintaining relatively high predictive accuracy, supporting its use as a rapid, non-invasive proxy for CNF hydrogel rheology in self-driven laboratory workflows.
- The impact of gamification on students’ motivation and academic performance in higher education computer science learning (2025-05-09) Anwer, Mohammed. Perustieteiden korkeakoulu | Bachelor's thesis. Gamification is the use of game design elements in non-game settings to engage participants and encourage desired behaviour. It has emerged as a transformative approach in computer science (CS) education, aiming to enhance student engagement, motivation, and learning outcomes by integrating game design elements into the curriculum. The thesis is a literature review that combines insights from multiple studies to examine the impact and implementation methods of gamification in higher-education computer science teaching and learning. The majority of the empirical evidence suggests that well-designed gamification systems can increase students’ engagement and participation while improving learning outcomes and academic performance. Gamification systems in CS education often result in higher assignment completion rates and improvements in grades or test scores. The theoretical perspective of self-determination theory (SDT) indicates that students can become more deeply invested in their studies when game elements fulfil their needs for competence, autonomy, and relatedness; this, in turn, enhances intrinsic motivation. However, not all studies report uniformly positive results: in some cases gamification yields no significant performance gain, or even weakens motivation if poorly implemented. Landers’ Theory of Gamified Learning identifies sustained practice, peer interaction, and immediate feedback as key mediators that convert game elements into enhanced academic outcomes.
The best practices for an effective gamified system in a CS higher education course include aligning mechanics with learning objectives, prioritizing meaningful feedback and retry opportunities, and integrating all course components into the gamified framework while balancing extrinsic rewards to support intrinsic motivation. The conclusion of the thesis is that gamification, when applied thoughtfully, is a promising tool to enrich CS education by increasing student engagement and improving academic performance. Careful design and adherence to motivational principles are crucial for its success.
- Determinism and probability (2026-03-11) Strand, Anton. School of Science | Bachelor's thesis. Probability is an important concept used in many different fields. Despite this, there are multiple interpretations of what probability means. According to classical logic, a proposition is either true or false, and its truth value is constant: true propositions have always been true, and a proposition that is not true will always be false. If these presuppositions are accepted, it is natural to affirm the theory that reality is as it is and could not have been otherwise. This is called determinism, a theory implying that even phenomena often considered random, such as nuclear decay, are in fact predetermined down to the smallest detail. Determinism affects how probability is understood, since probability is often closely tied to randomness. The aim of this thesis is to present an interpretation of probability that is consistent with determinism. A good interpretation should follow some mathematical axiomatization of probability; since Kolmogorov laid out his axiomatization, it has been common to use it. Countable additivity is often taken as an axiom of probability, but choosing finite additivity instead has the advantage of making it possible to assign a uniform probability distribution over a countably infinite sample space, which countable additivity rules out. There are multiple interpretations of probability in the literature: some are consistent with Kolmogorov's axiomatization, and some are based on other axiomatizations. In addition to being consistent with a mathematical axiomatization, a good interpretation of probability should also align with how people generally use the term "probability". The problem with many interpretations is that there are examples demonstrating that they do not align with this usage.
After assessing the validity of different interpretations of probability, their problems and strengths, and how consistent they are with determinism, this thesis suggests that probability should be understood as the objective degree of warrant for believing a proposition. This interpretation follows Kolmogorov's axiomatization with the axiom of finite additivity, and it is consistent with both determinism and the way the term "probability" is used. If a stochastic variable with a fixed probability distribution is sampled repeatedly, then the value one has most warrant to believe to be the mean converges toward the expected value; this is how expected value is interpreted under the account of probability presented in this thesis. The interpretation is very similar to a so-called evidential interpretation, where probability is understood as the degree to which the given evidence supports a hypothesis. Depending on how the language of the evidential interpretation is understood, the two may be different ways of speaking about the same interpretation.
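The claim above that countable additivity rules out a uniform distribution on a countably infinite sample space, while finite additivity does not, can be made precise in a few lines (a standard argument, not taken verbatim from the thesis):

```latex
\text{Assume } P(\{n\}) = c \text{ for every } n \in \mathbb{N}.
\text{ Countable additivity would give}
\quad
1 = P(\mathbb{N}) = \sum_{n=1}^{\infty} P(\{n\}) = \sum_{n=1}^{\infty} c
= \begin{cases} 0, & c = 0,\\ \infty, & c > 0, \end{cases}
\quad \text{a contradiction.}
```

Under finite additivity only finite unions are constrained, so setting $c = 0$ for every singleton while imposing $P(\mathbb{N}) = 1$ directly remains consistent.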
- Spoofing facial recognition systems and defending against spoofing (2026-02-22) Nurminen, Olli. School of Science | Bachelor's thesis. This bachelor's thesis is a literature review of spoofing methods against facial recognition systems and of methods for detecting such spoofing. The aim is to find out what kinds of attacks exist against facial recognition systems, how effective they are, and how they can be defended against. The thesis covers physical and digital attacks and defences against them. Physical attacks include simple photo and video attacks as well as more advanced patch attacks, spectacle frames, silicone masks, and the use of infrared light. Digital attacks include face morphing, data poisoning, and deepfakes. The attacks are highly effective against systems that have no spoofing detection in place. Defence methods include, among others, liveness detection and adding adversarial examples to the training data. Different defence methods have their own strengths and work very well against particular attacks; effective defence requires combining several of them. Attacks evolve as defences against them improve. Impersonation attacks target a specific individual and require technical expertise, so an ordinary person need not be greatly worried about becoming a target. It is nevertheless important for system developers to be aware of the vulnerabilities and to take them into account when designing systems.
- Safety of small modular nuclear power: Lessons from other safety-critical fields (2026-02-17) Brade, Elias. School of Science | Bachelor's thesis. The aim of this thesis was to investigate how safety lessons from other safety-critical fields could be applied in the safety design of small modular reactors (SMRs). The work was carried out by reviewing accident reports and safety lessons from aviation, offshore oil and gas drilling, and healthcare. The main difference between SMR plants and conventional nuclear power plants is their increased passive safety: they can be designed so that neither electricity nor an operator is needed to keep the reactor safe. The technology does not, however, remove the significance of human factors. The low staffing levels of SMR plants, possible remote operation, and other characteristics specific to SMRs bring new challenges to safety culture and to the prevention of human error compared with conventional power plants. A key lesson from aviation was crew resource management training and scenario-based simulation training, which improve teamwork and decision-making in abnormal situations. A key lesson from healthcare and aviation was a just, blame-free organisational culture. The safety lesson from offshore oil and gas drilling is a shared commitment to improving safety by all parties operating at a site, together with a clear division of responsibilities. The conclusion was that practices from other fields can be applied to improving SMR safety when combined with nuclear power's own regulation and technology. Several of the safety lessons discussed in the thesis are already required by regulation and applied in the safety systems of nuclear power plants.
- Local minima of the sphere packing function (2025-11-28) Heiskanen, Asla. Perustieteiden korkeakoulu | Bachelor's thesis. We investigate a 1970s series of papers by Fields detailing a concept for local minima of the lattice sphere packing function. We observe that Fields' concept of a "fragile lattice" essentially corresponds to the more recent notion of a weakly eutactic lattice. We also argue that fragile lattices are not a very natural opposite concept to extreme lattices, despite this being the original motivation: extreme lattices, too, become trivially fragile when an artificial restriction is removed from the definition. We introduce Fields' ideas, which correspond to the more modern concepts of minimal classes and weak eutaxy properties, as part of a general introduction to lattice theory and sphere packings. This culminates in a proof of Voronoi's characterization of extreme lattices. We then investigate two conjectures about automorphisms posed by Fields in his papers. The first conjecture is disproved by a simple counterexample. Attempts to disprove the second conjecture lead us to discover the existence of perfect forms with trivial automorphism groups.
- Impact of wall temperature on gas dynamics in the JET subdivertor (2026-03-02) Arola, Juho. School of Science | Bachelor's thesis. Tokamak-type fusion reactors are a possible source of abundant clean energy, but among other open issues, the vacuum pumping systems are still incompletely understood. To control the plasma density and, in future fusion power plants, to remove helium ash, neutral particles are removed by a cryopump through a structure called a subdivertor. The pumping needs to be well understood, efficient, and as controllable as possible. In this thesis, the impact of the wall temperature on the pumping and on the conditions in the subdivertor is investigated. The study is conducted with EIRENE, a Monte Carlo solver for the Boltzmann equation. To verify the simulations, the conductance calculated by EIRENE in a simplified simulation is compared to analytical solutions under ideal gas assumptions. The simulations are conducted in a toroidally symmetric approximation of the JET subdivertor geometry with uniform wall temperatures, with the exception of a cool pump surface. The simplified geometry for EIRENE validation is a 2 m × 1 m × 9 m box with the source at the bottom, the pump at the top, and a thin aperture in the middle; both uniform and non-uniform wall temperatures are tested. The results demonstrate that when the wall temperature is increased from 373 K to 1160 K, the pressure and the pressure gradient near the louvres increase by a factor of approximately 1.7. The difference between the wall and cryopump temperatures also affects the pressure and causes non-isothermal conditions. The EIRENE simulations in the simplified geometry produce conductance estimates that differ from the analytical results by a constant factor of approximately 1.35 for small apertures; with large apertures, the values exceed the analytical solution by up to a factor of 2.
The gas temperature, and consequently the wall temperature, is an important factor when comparing pressure values. Pressure gauges in experimental reactors should therefore be coupled with temperature sensors, as temperature greatly affects the measured values. EIRENE provides conductance values in restricted geometries and conditions to within a factor of 1.35, but further studies are required to understand this constant factor in relation to the analytical solutions.
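One plausible form of the "analytical solutions under ideal gas assumptions" mentioned above is the standard free-molecular (Knudsen) conductance of a thin aperture. The sketch below computes it; the gas species, temperature, and aperture area are illustrative assumptions, not values taken from the thesis.

```python
# Free-molecular aperture conductance: C = (v_mean / 4) * A,
# with v_mean = sqrt(8RT / (pi M)). Inputs below are illustrative only.
import math

def aperture_conductance(area_m2: float, temp_K: float, molar_mass_kg: float) -> float:
    """Return the Knudsen aperture conductance in m^3/s."""
    R = 8.314  # gas constant, J/(mol K)
    v_mean = math.sqrt(8 * R * temp_K / (math.pi * molar_mass_kg))  # mean speed, m/s
    return v_mean / 4 * area_m2

# Example: a 0.01 m^2 aperture, deuterium gas (M = 4.028 g/mol) at 373 K
C = aperture_conductance(0.01, 373.0, 4.028e-3)
print(f"conductance: {C:.2f} m^3/s")
```

The square-root temperature dependence of v_mean is the reason wall (and hence gas) temperature matters so much when comparing simulated and measured pressures.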
- Effects of viscosity on swimming kinematics and symmetry breaking in micro- to mesoscale swimmers (2025-11-30) Balthasar, Frida. Perustieteiden korkeakoulu | Bachelor's thesis. Swimming strategies vary across different regimes of Reynolds number as the relative significance of viscous and inertial forces changes. At the macroscale, inertia contributes substantially to propulsion, while locomotion at the microscale requires non-reciprocal motion, a breaking of time-symmetry. Between the well-understood micro- and macroscales lies the mesoscale regime, where both viscous and inertial forces are significant and swimming remains less fully understood. This intermediate regime is particularly interesting, as neither purely viscous nor entirely inertial models can accurately describe it. Artemia is a micro- to mesoscale swimmer that transitions across Reynolds numbers during its growth and development. In this thesis, the effects of an increase in viscosity on the swimming kinematics and symmetry breaking of Artemia were investigated. High-speed recordings of Artemia swimming in brine and in a polyvinylpyrrolidone (PVP) solution were captured, and the resulting footage was analysed using deep-neural-network-based image tracking software (DeepLabCut). By increasing the viscosity of the surrounding medium, the Reynolds numbers of the Artemia were artificially shifted down to levels the animals would otherwise not experience at their body size and developmental stage: the swimming Reynolds numbers were reduced from 1–7 in brine to below 1 in PVP. Under these conditions, decreases in antenna tip velocity and stroke amplitude were observed, while swimming frequency remained unaffected. A decrease in the level of symmetry breaking was also noted, indicating a shift to more reciprocal motion as viscosity increases. These results provide insight into how real micro- to mesoscale swimmers, like Artemia, adapt to transitions in Reynolds number caused by changes in viscosity.
- Quasiconformal mappings and the uniform density property (2026-02-26) Toivonen, Samuel. School of Science | Bachelor's thesis. The subject of this thesis is two properties of functions (more precisely, of homeomorphisms): quasiconformality and the uniform density property. Both properties concern how mappings change the properties of small sets; in practice, the thesis focuses on examining small balls. Quasiconformal mappings stretch infinitesimally small balls by a finite amount, while the uniform density property says that the density of the function changes in a continuous way in a neighbourhood of every point. The thesis treats the proof presented in the article [3] by F.W. Gehring and J.C. Kelly, according to which quasiconformality and the uniform density property are equivalent properties of functions. The thesis expands the part of the proof showing that the uniform density property follows from quasiconformality, and writes it out in a form accessible to a bachelor-level student. The most important mathematical tools and results used in the thesis are the Lebesgue measure, L^n-integrability, the maximal stretching of a function and the generalised Jacobian, as well as Hölder's inequality. The proof is carried out in R^n spaces with n ≥ 2.
- Lumped-element resonator in a hermetically sealed cavity (2026-02-25) Mojica, Matias. School of Science | Bachelor's thesis. As quantum computing advances towards practical implementation, preserving coherence in superconducting circuits remains a central challenge in developing scalable systems. Advances in materials science and fabrication techniques have significantly improved component quality over the past decade. In particular, treatments such as chemical etching and protective coatings have demonstrated reductions in two-level system (TLS) losses by reducing oxide-related dissipation. However, these methods are insufficient to prevent re-oxidation when devices are exposed to the atmosphere between fabrication and measurement, introducing decoherence channels and hindering reproducibility across cooldown cycles. This thesis addresses these issues by fabricating and evaluating a hermetically sealed three-dimensional aluminum cavity capable of preserving vacuum and preventing oxidation during cooldown cycling. We begin by fabricating the cavity, after which we test its vacuum integrity using a helium leak detector and its structural integrity under repeated thermal changes. After this, we characterize test samples and assess the cavity’s suitability for standardized quality factor and two-level system loss measurements. The results show that the cavity remains sealed across varying temperatures and supports reliable characterization measurements. These results demonstrate that hermetically sealed cavities present a promising strategy for reducing decoherence and improving the reproducibility of superconducting qubit experiments, although further research is warranted to evaluate the long-term viability of the device.
- Machine learning methods for droplet analysis (2025-12-12) Vuorio, Miika. School of Science | Bachelor's thesis. Measuring wetting-related surface properties is essential for many industrial and scientific applications. For example, in the active research field of digital microfluidics, where electrode grids are employed to efficiently and precisely control the location of a droplet, candidate surfaces must exhibit high hydrophobicity and low surface heterogeneity. The current standard methodology for characterising these properties, contact angle goniometry, suffers from low throughput and is infeasible to implement at the end of a production line, e.g., for the aforementioned digital microfluidic chips. Consequently, developing more efficient methods for measuring surface properties of materials would facilitate industrial scalability. Despite the recent success of machine learning techniques in various scientific and industrial applications, their use in analysing wetting characteristics of materials has remained relatively unexplored. To address this need, this thesis develops machine learning models for analysing surface wetting. First, a custom convolutional neural network (CNN) for localising droplets on a surface was developed and trained on synthetic data. The performance of this model was then compared to a contemporary object recognition model, YOLOv5, that was fine-tuned to draw bounding boxes around droplets. Finally, outputs of YOLOv5 from 24 frames of experimental video were converted into masks and combined with the original video to serve as input to a ResNet-based model trained to predict surface heterogeneity directly. The custom CNN achieved reasonable accuracy with a mean Euclidean distance of 5.34 pixels between the predicted droplet centre and the ground truth on the synthetic dataset. However, it was outperformed by the fine-tuned YOLOv5 model, which had a mean Euclidean error of 1.31 pixels and more robust performance on experimental data.
Finally, the ResNet-based model was able to characterise large differences in surface heterogeneity but lacked the sensitivity to distinguish between materials with a small difference in contact angle hysteresis. These results demonstrate the feasibility of machine learning methods for high-throughput material characterisation.
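The localisation metric quoted above (mean Euclidean distance between predicted and ground-truth droplet centres) can be sketched in a few lines; the coordinates below are made up for illustration and are not the thesis's data.

```python
# Sketch of the evaluation metric: mean per-droplet Euclidean error in pixels.
import numpy as np

def mean_euclidean_error(pred: np.ndarray, truth: np.ndarray) -> float:
    """Mean Euclidean distance between matched centre predictions and ground truth."""
    return float(np.linalg.norm(pred - truth, axis=1).mean())

# Two droplets, each predicted 5 px away from its true centre
pred  = np.array([[100.0, 50.0], [203.0, 84.0]])
truth = np.array([[103.0, 54.0], [200.0, 80.0]])
print(mean_euclidean_error(pred, truth))  # -> 5.0
```

This assumes predictions are already matched one-to-one with ground-truth droplets; with multiple detections per frame a matching step (e.g. by IoU or nearest centre) would come first.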
- A comparison of liquid nitrogen and liquid helium as cryogens for cryo-transmission electron microscopy by using ultramicrotomed sections of soft matter specimens (2026-02-07) Hanski, Lotta. School of Science | Bachelor's thesis. For electron-beam-sensitive samples, such as soft matter and biological specimens, radiation damage is the biggest limiting factor in achieving high-quality images in electron microscopy. Liquid nitrogen is widely used as a cryogen to minimize electron beam damage by cooling the sample to cryogenic temperatures; however, results have varied on whether the deterioration of the sample's structure can be slowed further by decreasing the temperature with liquid helium as the coolant. The aim of this study is to examine how liquid helium compares to liquid nitrogen as a cryogen in cryogenic transmission electron microscopy in reducing radiation damage and achieving high-quality images of ultramicrotomed sections of three different specimens: a soft matter specimen of a 1:1 ratio of P4VP and PDP, a block copolymer with a liquid crystal layer, and poly(ethylene oxide) (PEO). The extent of radiation damage occurring in the liquid crystal and P4VP(PDP) is studied at liquid nitrogen and liquid helium temperatures by analyzing fading Fast Fourier Transform patterns from different regions of interest on the specimens. It is found that liquid helium cooling decelerates radiation damage of P4VP(PDP) and the liquid crystal by factors of 1.6 and 1.5, respectively. Furthermore, the destruction of PEO at liquid helium and liquid nitrogen temperatures is studied qualitatively, by visually examining image sets acquired from different regions of interest. Liquid helium cooling is found to significantly reduce radiation damage of the PEO specimen, by a factor of 3.5–4.
- Towards magneto-ionic control of spin waves: A pathway to energy-efficient wave-based computing (2025-12-26) Zinetti, Filippo. School of Science | Bachelor's thesis. Magnonic computing represents a new paradigm in information processing, which leverages magnetic excitations called spin waves (SWs) as information carriers. Typically, control over spin waves is achieved via multiple conversion steps, which lead to parasitic losses. To overcome this, a magneto-ionic approach could enable direct, low-power, high-efficiency coupling of voltage to magnetic properties. Despite this prospect, there is currently no example of magneto-ionic control of spin waves, owing to material incompatibilities. This thesis therefore tests and evaluates the potential of a specific ferromagnet-metal combination, yttrium iron garnet (YIG) and platinum (Pt), as the lowermost layers of a future magneto-ionic platform. The annealing of these materials is studied and its parameters are optimised to ensure simultaneous YIG crystallisation and Pt preservation. Finally, this work reports the first monocrystalline YIG/Pt heterostructure in which both layers are grown at room temperature. Remarkably, the experiments also show the potential for spin wave propagation on YIG across a 10-µm-wide Pt stripe with nanopatterned structures. These findings highlight fundamental challenges for future monolithic devices, which require both single-crystal YIG and lithographic patterns, and provide relevant insights towards the fabrication of a magneto-ionic device for spin wave control.
- Performance evaluation of L4S in 5G networks (2025-12-15) Lassila, Robi. School of Science | Bachelor's thesis. This bachelor's thesis examines a new technology called Low Latency, Low Loss and Scalable Throughput (L4S) in 5G mobile networks. As the name suggests, L4S aims to provide low latency and low packet loss while allowing data rates to scale. The topic is important as the number of devices requiring an internet connection rises, and for future applications that demand low latency, for example XR applications. The purpose of this thesis is to evaluate the performance of L4S in 5G mobile networks compared to older congestion control algorithms. The thesis explains the concept of L4S and its functionality based on the available literature, and its performance is measured through field testing in 5G mobile networks. In the test results, we found a significant difference in latency between L4S enabled and disabled: on average, latencies improved by 10 milliseconds when the network experienced congestion, from approximately 20 milliseconds with L4S disabled to approximately 10 milliseconds with L4S enabled. This latency improvement of approximately 50 % is quite significant. However, we also found that the throughput of the non-L4S flow was significantly lower when L4S was enabled. In conclusion, L4S has the capacity and potential to improve latencies, which is especially important for low-latency applications. The results motivate further development and improvement of L4S.
- The user's experience of privacy in voice user interfaces (2026-01-23) Ypyä, Linnea. School of Science | Bachelor's thesis. The increased use of voice user interfaces has raised concerns about users’ privacy. Previous research has shown that users’ subjective experiences of privacy do not always correspond to the objective level of privacy provided by a system. This thesis examines users’ subjective experiences of privacy in voice user interfaces. The aim of the study is to investigate how interface features, interaction, and the usage environment and context shape the experience of privacy. The study is conducted as a literature review. As a theoretical framework, the thesis draws on Sandra Petronio’s Communication Privacy Management (CPM) theory and Helen Nissenbaum’s theory of Contextual Integrity. In addition, the experience of privacy is examined from the perspective of privacy-supporting design principles. The results indicate that users’ sense of control and the transparency of system operation play a central role in shaping the experience of privacy. By strengthening users’ sense of control and the predictability of system behavior, it is possible to reduce fears of surveillance and the resulting behavioral constraints. The experience of privacy is particularly weakened in situations where data flows, data-sharing practices, and data recipients are unclear. Voice user interfaces should therefore support the experience of privacy through active, dialog-based communication that helps users understand system operations and maintain a sense of control.
- Improving quality of LLM-driven annotation of political data (2026-01-20) Onnela, Markus. School of Science | Bachelor's thesis. Large language models are increasingly used to annotate political text, as they can offer gains in scalability compared to traditional human annotators. However, the use of these models has raised concerns about annotation quality, the reproducibility of studies, and the introduction of systematic biases into the complex and often ambiguous task of labelling political concepts. This thesis surveys methods for improving political data annotation, focusing on prompt design, annotation workflows, evaluation metrics, and their respective risks. The reviewed studies suggest that no single model or prompting strategy can ensure annotation quality; scientifically effective use of LLMs instead requires methodological transparency, with a focus on constraining the degrees of freedom available when working with large language models in order to ensure reproducibility.
- The role of artificial intelligence in the new product development process from a knowledge management perspective (2025-12-24) Ruska, Pyry. School of Science | Bachelor's thesis. This bachelor's thesis examines how artificial intelligence is changing the new product development (NPD) process from a knowledge perspective. The aim is to produce a literature review of what kinds of knowledge arise at the different stages of product development, how knowledge is refined and transferred, and in what ways AI can support, change, or challenge these processes. The work is conducted as a literature review that combines research literature on different product development models (e.g. Stage-Gate, Agile, Lean Startup, Design Thinking), on knowledge management and cross-functional collaboration (e.g. tacit vs. explicit knowledge, boundary and interpretation challenges), and on the applications and forms of AI. Based on the results, product development can be structured into four general stages: (1) ideation and concept development, (2) design and prototyping, (3) testing and validation, and (4) commercialisation and continuous development. Across these stages, the nature of knowledge shifts from dispersed and open to interpretation towards measurable, documented, and explicit knowledge. At the same time, the key challenges that emerge are the fragmentation of knowledge, ambiguity, knowledge boundary problems, and decision-making biases. AI can help address these challenges in particular by refining unstructured information, supporting the modelling of technical dependencies, making anomaly detection in test data more efficient, and strengthening learning based on market and usage data in the commercialisation stage. On the other hand, leveraging AI requires high-quality data, and it cannot resolve all of the challenges. In addition, the role of AI in product development can be divided into process automation, cognitive insight, and cognitive engagement. These roles are not uniform throughout the product development process; rather, different roles are emphasised at different stages of product development.
- Green hydrogen as an investment (2026-01-15) Rusanen, Pihla. School of Science | Bachelor's thesis. Green hydrogen is a promising option for reducing emissions in the energy sector. It can help balance the variable production of renewable energy and reduce dependence on Russian fossil energy. Particular benefits can be achieved by using green hydrogen in hard-to-electrify sectors of industry and transport. The challenges of green hydrogen, however, are its lack of cost competitiveness and infrastructure, as well as the technology, regulatory, and cash-flow risks associated with its projects. In particular, the high risk level of the projects poses challenges for their financing, which is a key obstacle to the realisation of investments and to the development of the sector as a whole. This bachelor's thesis aims to determine what kind of investment green hydrogen is and how the attractiveness of hydrogen investments could be improved. The work is a literature review. Since it is above all the high risk level that leads to financiers' hesitation, the work focuses on investment risks and on means of mitigating them. Green hydrogen is a risky investment. The attractiveness of green hydrogen projects can, however, be improved through various risk-mitigation instruments, such as public-private partnership agreements, green bonds, and offtake agreements. These instruments can play an important role in attracting investors to projects and thereby in advancing the development of the sector.