Web cookies (also called HTTP cookies, browser cookies, or simply cookies) are small pieces of data that websites store on your device (computer, phone, etc.) through your web browser. They are used to remember information about you and your interactions with the site.
Purpose of Cookies:
Session Management:
Keeping you logged in
Remembering items in a shopping cart
Saving language or theme preferences
Personalization:
Tailoring content or ads based on your previous activity
Tracking & Analytics:
Monitoring browsing behavior for analytics or marketing purposes
Types of Cookies:
Session Cookies:
Temporary; deleted when you close your browser
Used for things like keeping you logged in during a single session
Persistent Cookies:
Stored on your device until they expire or are manually deleted
Used for remembering login credentials, settings, etc.
First-Party Cookies:
Set by the website you're visiting directly
Third-Party Cookies:
Set by other domains (usually advertisers) embedded in the website
Commonly used for tracking across multiple sites
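The difference between session and persistent cookies comes down to the Expires/Max-Age attributes on the Set-Cookie header. A minimal sketch using Python's standard http.cookies module (the cookie names here are invented for illustration):

```python
from http.cookies import SimpleCookie

# Build two illustrative cookies (names are made up for this example).
jar = SimpleCookie()

# Session cookie: no Expires/Max-Age, so the browser discards it on close.
jar["session_id"] = "abc123"
jar["session_id"]["path"] = "/"

# Persistent cookie: Max-Age keeps it on disk for 30 days.
jar["theme"] = "dark"
jar["theme"]["max-age"] = 60 * 60 * 24 * 30

for morsel in jar.values():
    print(morsel.OutputString())
```

The first header has no lifetime attribute (session cookie); the second carries `Max-Age=2592000` and therefore persists.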
Authentication cookies are a special type of web cookie used to identify and verify a user after they log in to a website or web application.
What They Do:
Once you log in to a site, the server creates an authentication cookie and sends it to your browser. This cookie:
Proves to the website that you're logged in
Prevents you from having to log in again on every page you visit
Can persist across sessions if you select "Remember me"
What's Inside an Authentication Cookie?
Typically, it contains:
A unique session ID (not your actual password)
Optional metadata (e.g., expiration time, security flags)
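As a sketch of what a server might generate (the cookie name, lifetime, and helper function are hypothetical, not a prescribed standard), Python's standard library can build such a header:

```python
import secrets
from http.cookies import SimpleCookie

def make_auth_cookie(remember_me: bool = False) -> str:
    """Build a Set-Cookie header value for a hypothetical login session."""
    jar = SimpleCookie()
    # A random opaque session ID -- never the password itself.
    jar["sid"] = secrets.token_urlsafe(32)
    jar["sid"]["httponly"] = True   # JavaScript cannot read it
    jar["sid"]["secure"] = True     # only sent over HTTPS
    jar["sid"]["samesite"] = "Lax"  # limits cross-site sending
    if remember_me:
        jar["sid"]["max-age"] = 60 * 60 * 24 * 14  # persist for 14 days
    return jar["sid"].OutputString()

print(make_auth_cookie(remember_me=True))
```

Without "Remember me" the header carries no Max-Age, so the cookie lives only for the browser session.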
Analytics cookies collect data about how visitors interact with a website. Their primary purpose is to help website owners understand and improve the user experience by analyzing things like:
How users navigate the site
Which pages are most/least visited
How long users stay on each page
What device, browser, or location the user is from
What They Track:
Some examples of data that analytics cookies may collect:
Page views and time spent on pages
Click paths (how users move from page to page)
Bounce rate (the share of visitors who leave after viewing only one page)
User demographics (location, language, device)
Referring websites (how users arrived at the site)
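A toy illustration of how a couple of these metrics fall out of a page-view log (the log, session IDs, and page paths are invented for this example):

```python
from collections import defaultdict

# Hypothetical page-view log: (session_id, page) in chronological order.
events = [
    ("s1", "/home"), ("s1", "/pricing"), ("s1", "/signup"),
    ("s2", "/home"),                      # s2 bounces: one page only
    ("s3", "/blog"), ("s3", "/home"),
]

page_views = defaultdict(int)
pages_per_session = defaultdict(list)
for sid, page in events:
    page_views[page] += 1
    pages_per_session[sid].append(page)

# Bounce rate: share of sessions that viewed exactly one page.
bounces = sum(1 for pages in pages_per_session.values() if len(pages) == 1)
bounce_rate = bounces / len(pages_per_session)

print(page_views["/home"])    # 3
print(round(bounce_rate, 2))  # 0.33
```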
Here’s how you can disable cookies in common browsers:
1. Google Chrome
Open Chrome and click the three vertical dots in the top-right corner.
Go to Settings > Privacy and security > Cookies and other site data.
Choose your preferred option:
Block all cookies (not recommended, as it can break most websites).
Block third-party cookies (blocks most advertising and cross-site tracking cookies).
2. Mozilla Firefox
Open Firefox and click the three horizontal lines in the top-right corner.
Go to Settings > Privacy & Security.
Under the Enhanced Tracking Protection section, choose Strict to block most cookies or Custom to manually choose which cookies to block.
3. Safari
Open Safari and click Safari in the top-left corner of the screen.
Go to Preferences > Privacy.
Check Block all cookies to stop all cookies, or select options to block third-party cookies.
4. Microsoft Edge
Open Edge and click the three horizontal dots in the top-right corner.
Go to Settings > Privacy, search, and services > Cookies and site permissions.
Select your cookie settings from there, including blocking all cookies or blocking third-party cookies.
5. On Mobile (iOS/Android)
For Safari on iOS: Go to Settings > Safari > Privacy & Security > Block All Cookies.
For Chrome on Android: Open the app, tap the three dots, go to Settings > Privacy and security > Cookies.
Be Aware:
Disabling cookies can make your online experience more difficult. Some websites may not load properly, or you may be logged out frequently. Also, certain features may not work as expected.
Topics: Laplacian Eigenmaps, Orthogonal Polynomials, Quantum Information
Participants: Farabie Akanda; Haverford College
Elijah Anderson; Wesleyan University
Elizabeth Athaide; Massachusetts Institute of Technology
Faye Castro; Texas State University
Sara Costa; University of Hartford
Leia Donaway; Swarthmore College
Hank Ewing; Appalachian State University
Caleb Findley; University of Texas at Arlington
August Noë; University of California Santa Cruz
Sam Trombone; Hamilton College
Kai Zuang; Brown University
John Ackerman; University of Connecticut
Mentors: Bernard Akwei, Rachel Bailey, Maxim Derevyagin, Luke Rogers, Alexander Teplyaev
Publication: Rachel Bailey, Sara Costa, Maxim Derevyagin, Caleb Findley & Kai Zuang, “Hamiltonians that realize perfect quantum state transfer and early state exclusion,” Quantum Information Processing 24, 51 (2025). https://doi.org/10.1007/s11128-025-04667-z
REU participants:
Bobita Atkins, Massachusetts College of Liberal Arts
Ashka Dalal, Rose-Hulman Institute of Technology
Natalie Dinin, California State University, Chico
Jonathan Kerby-White, Indiana University Bloomington
Tess McGuinness, University of Connecticut
Tonya Patricks, University of Central Florida
Genevieve Romanelli, Tufts University
Yiheng Su, Colby College
Mentors: Bernard Akwei, Rachel Bailey, Luke Rogers, Alexander Teplyaev
Eigenmaps are important in analysis, geometry and machine learning, especially in nonlinear dimension reduction.
Versions of the Laplacian eigenmaps of Belkin and Niyogi are a widely used nonlinear dimension reduction technique in data analysis. Data points in a high dimensional space \(\mathbb{R}^N\) are treated as vertices of a graph, for example by taking edges between points separated by distance at most a threshold \(\epsilon\) or by joining each vertex to its \(k\) nearest neighbors. A small number \(D\) of eigenfunctions of the graph Laplacian are then taken as coordinates for the data, defining an eigenmap to \(\mathbb{R}^D\). This method was motivated by an intuitive argument suggesting that if the original data consisted of \(n\) sufficiently well-distributed points on a nice manifold \(M\) then the eigenmap would preserve geometric features of \(M\).
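As a concrete model case (our illustration, not the project's code), the cycle graph has closed-form Laplacian eigenvectors, so the eigenmap can be checked by hand: the first nontrivial eigenfunctions recover the circle exactly.

```python
import math

# n data points on a cycle graph (each vertex joined to its two nearest
# neighbors).  For the cycle, the graph Laplacian L = 2I - A has explicit
# eigenvectors cos(2*pi*k*j/n) and sin(2*pi*k*j/n).
n = 200
theta = [2 * math.pi * j / n for j in range(n)]

def laplacian_apply(f):
    """(L f)(j) = 2 f(j) - f(j-1) - f(j+1) on the cycle (indices mod n)."""
    return [2 * f[j] - f[j - 1] - f[(j + 1) % n] for j in range(n)]

# First nontrivial eigenpair: eigenvalue 2 - 2 cos(2*pi/n).
v = [math.cos(t) for t in theta]
lam = 2 - 2 * math.cos(2 * math.pi / n)
residual = max(abs(lv - lam * vj) for lv, vj in zip(laplacian_apply(v), v))
print(residual < 1e-12)  # True: v is an exact eigenvector

# Eigenmap to R^2 using the two eigenfunctions at this eigenvalue:
# j -> (cos(2*pi*j/n), sin(2*pi*j/n)), which recovers the circle.
embedding = [(math.cos(t), math.sin(t)) for t in theta]
```

Here the embedding is exact because the cycle's eigenfunctions are trigonometric; for scattered data the same construction is carried out numerically on the sampled graph Laplacian.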
Several authors have developed rigorous results on the geometric properties of eigenmaps, using a number of different assumptions on the manner in which the points are distributed, as well as hypotheses involving, for example, the smoothness of the manifold and bounds on its curvature. Typically, they use the idea that under smoothness and curvature assumptions one can approximate the Laplace-Beltrami operator of \(M\) by an operator giving the difference of the function value and its average over balls of a sufficiently small size \(\epsilon\), and that this difference operator can be approximated by graph Laplacian operators provided that the \(n\) points are sufficiently well distributed.
In the present work we consider several model situations where eigen-coordinates can be computed analytically as well as numerically, including the interval with uniform and weighted measures, the square, the torus, the sphere, and the Sierpinski gasket. On these examples we investigate the connections between eigenmaps and orthogonal polynomials, how to determine the optimal value of \(\epsilon\) for a given \(n\) and prescribed point distribution, and the dependence and stability of the method as the choice of Laplacian is varied. These examples are intended to serve as model cases for later research on the corresponding problems for eigenmaps on weighted Riemannian manifolds, possibly with boundary, and on some metric measure spaces, including fractals.
Approximation of the eigenmaps of a Laplace operator depends crucially on the scaling parameter \(\epsilon\). If \(\epsilon\) is too small or too large, then the approximation is inaccurate or completely breaks down. However, an analytic expression for the optimal \(\epsilon\) is out of reach. In our work, we use some explicitly solvable models and Monte Carlo simulations to find the approximately optimal value of \(\epsilon\) that gives, on average, the most accurate approximation of the eigenmaps.
Our study is primarily inspired by the work of Belkin and Niyogi “Towards a theoretical foundation for Laplacian-based manifold methods.”
Results were presented at the 2023 Young Mathematicians Conference (YMC) at the Ohio State University, a premier annual conference for undergraduate research in mathematics, and at the 2024 Joint Mathematics Meetings (JMM) in San Francisco, the largest mathematics gathering in the world.
Bobita Atkins
Ashka Dalal
Natalie Dinin
Jonathan Kerby-White
Tess McGuinness
Tonya Patricks
Genevieve Romanelli
Yiheng Su
Working with Professor Sasha Teplyaev
Yiheng and Jonathan share their results.
Vievie and Tonya present their work.
Bobita, Ashka, and Natalie explain random eigencoordinates.
Overview: We study and simulate on computers the fractional Gaussian fields and their discretizations on surfaces like the two-dimensional sphere or two-dimensional torus. The study of the maxima of those processes will be done and conjectures formulated concerning limit laws. Particular attention will be paid to log-correlated fields (the so-called Gaussian free field).
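A toy illustration of such a simulation (the truncation level, grid size, and exponent are arbitrary choices for this sketch, not the project's parameters): sample an approximate fractional Gaussian field on the two-dimensional torus by truncated Fourier synthesis, where the exponent s = 1 corresponds to the log-correlated (Gaussian free field) regime.

```python
import math, random

# Truncated Fourier synthesis on the 2D torus:
#   f(u, v) = sum over modes k != 0 of
#             |k|^{-s} * (xi_k cos(2*pi*k.(u,v)) + eta_k sin(2*pi*k.(u,v)))
# with i.i.d. standard Gaussians xi_k, eta_k.
random.seed(1)
K, s, grid = 8, 1.0, 24

modes = [(k1, k2) for k1 in range(-K, K + 1) for k2 in range(-K, K + 1)
         if (k1, k2) != (0, 0)]
coeff = {k: (random.gauss(0, 1), random.gauss(0, 1)) for k in modes}

def field(u, v):
    total = 0.0
    for (k1, k2), (xi, eta) in coeff.items():
        w = (k1 * k1 + k2 * k2) ** (-s / 2)
        phase = 2 * math.pi * (k1 * u + k2 * v)
        total += w * (xi * math.cos(phase) + eta * math.sin(phase))
    return total

samples = [[field(i / grid, j / grid) for j in range(grid)] for i in range(grid)]
peak = max(max(row) for row in samples)
print("max of sampled field:", round(peak, 3))
```

The maximum over the grid is the quantity whose limit laws the project studies; since the k = 0 mode is excluded, the field averages to zero over the torus and the maximum is strictly positive.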
The REU students and mentor work with Professor Baudoin
A box-ball system is a collection of discrete time states representing a permutation, on which there is an action called a BBS move. After a finite number of BBS moves the system decomposes into a collection of soliton states; these are weakly increasing and invariant under BBS moves. The students proved that when this collection of soliton states is a Young tableau or coincides with a partition of a type described by Robinson–Schensted (RS), then it is an RS insertion tableau. They also studied the number of steps required to reach this state.
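The basic Takahashi–Satsuma box-ball system on binary strings (a simpler variant than the permutation systems studied here) can be simulated in a few lines; this sketch is our illustration, not the students' code:

```python
def bbs_step(state):
    """One move of the basic Takahashi-Satsuma box-ball system.

    state: list of 0/1 (empty box / ball).  Each ball, taken left to
    right, jumps to the nearest empty box to its right, and each ball
    moves exactly once per time step.  Assumes enough trailing empty
    boxes for every ball to land.
    """
    boxes = list(state)
    moved = [False] * len(boxes)
    for i in range(len(boxes)):
        if boxes[i] == 1 and not moved[i]:
            j = i + 1
            while boxes[j] == 1:  # find nearest empty box to the right
                j += 1
            boxes[i], boxes[j] = 0, 1
            moved[j] = True
    return boxes

# A single soliton of length 3 travels 3 boxes per time step:
s = bbs_step([1, 1, 1, 0, 0, 0, 0, 0, 0])
print(s)  # [0, 0, 0, 1, 1, 1, 0, 0, 0]
```

Iterating `bbs_step` on a general string shows the decomposition into solitons that the paragraph above describes in the tableau setting.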
In practice, financial models are not exact: as in any field, modeling based on real data introduces some degree of error. However, we must consider the effect this error has on the calculations and assumptions we make in the model. In complete markets, optimal hedging strategies can be found for derivative securities; for example, the recursive hedging formula introduced in Steven Shreve’s “Stochastic Calculus for Finance I” gives an exact expression in the binomial asset model, and as a result the unique arbitrage-free price can be computed at any time for any derivative security.
In incomplete markets this cannot be accomplished; one possibility for computing optimal hedging strategies is the method of sequential regression. We considered this in discrete time; in the (complete) binomial model we showed that the strategy of sequential regression introduced by Föllmer and Schweizer is equivalent to Shreve’s recursive hedging formula, and in the (incomplete) trinomial model we both explicitly computed the optimal hedging strategy predicted by the Föllmer–Schweizer decomposition and showed that the strategy is stable under small perturbations.
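The recursive hedging formula in the (complete) binomial model can be sketched as follows; the numbers in the usage example are Shreve's classic one-period parameters, and the function name is ours:

```python
def binomial_price(S0, u, d, r, payoff, N):
    """Backward recursion in the N-period binomial model (as in Shreve,
    "Stochastic Calculus for Finance I"): returns the arbitrage-free
    time-0 price and the time-0 hedge ratio (delta)."""
    p = (1 + r - d) / (u - d)   # risk-neutral up probability
    q = 1 - p
    # Option values at maturity on the recombining tree (k = number of ups).
    V = [payoff(S0 * u ** k * d ** (N - k)) for k in range(N + 1)]
    # Roll the values back to time 1.
    for n in range(N - 1, 0, -1):
        V = [(p * V[k + 1] + q * V[k]) / (1 + r) for k in range(n + 1)]
    # Hedge ratio and price at time 0.
    delta0 = (V[1] - V[0]) / (S0 * u - S0 * d)
    V0 = (p * V[1] + q * V[0]) / (1 + r)
    return V0, delta0

# Shreve's one-period example: S0 = 4, u = 2, d = 1/2, r = 1/4, call K = 5.
price, delta = binomial_price(4, 2, 0.5, 0.25, lambda s: max(s - 5, 0), 1)
print(price, delta)  # 1.2 0.5
```

In the binomial model this recursion and the Föllmer–Schweizer sequential regression produce the same strategy, which is the equivalence established above; in the trinomial model the regression must be carried out explicitly.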
Publication: “Stability and asymptotic analysis of the Föllmer–Schweizer decomposition on a finite probability space,” Involve, a Journal of Mathematics, vol. 13 (2020). https://doi.org/10.2140/involve.2020.13.607