Episode 80 - Kimberly Ayers
Evelyn Lamb: Hello and welcome to My Favorite Theorem, the math podcast with no quiz at the end. My name is Evelyn Lamb. I'm a freelance math and science writer in beautiful Salt Lake City, Utah, where fall is just gorgeous and everyone who's on this recording, which means no one listening to it, gets to see this cute Zoom background I have from this fall hike I did recently with this mountain goat, like, posing for me in the back. It kind of looks like a bodybuilder, honestly, like really beefy. But yeah, super cute mountain goat. So yeah, that's really helpful for everyone at home. Here is our other host.
Kevin Knudson: I’m Kevin Knudson, a professor of mathematics at the University of Florida. On the internet, they would call him an absolute unit, right?
EL: Definitely. At least they would have five years ago. Who knows these days?
KK: Shows how out of touch I am. That's right. Yeah, here we are. Yeah, it's actually lovely in Gainesville. Like I've got short sleeves on, but it's like 75 and sunny and just everything you want it to be.
EL: Perfect.
KK: And tomorrow, tomorrow's homecoming at the university, which means that it's closed — this is bizarre — for a parade. But as it happens, tomorrow is also my birthday.
EL: Wow!
KK: So I get the day off, and it’s unclear what I'm going to do yet.
EL: Well, just having a day off to lie in bed as long as you want, you know, drink your coffee at a leisurely pace.
KK: Absolutely.
EL: It’ll be great. Yeah, and we are recording this shortly after Hurricane Ian. And so you're here, so you made it through okay. I actually don't know my Florida geography well enough to remember where Gainesville is.
KK: Gainesville is north central. And weirdly, this cold front sort of pushed just south of town right before. It was about 65 degrees for three or four days, which is freakish. The hurricane, of course, took its very destructive path entering around Fort Myers, went across over Orlando, then to the Atlantic side. We got about a half inch of rain. It was — I mean, we were expecting, like, 10 inches, and then that weird path happened. Of course, a lot of our students, you know, their homes have just been devastated. It's a rough time, but you know, the governor and the President are at least putting aside their differences temporarily and making some good progress. Well, we'll see. It's gonna be a long rebuild down there.
EL: Yeah.
KK: And it's a beautiful part of the state, and I feel bad for everyone down there.
EL: Definitely.
KK: But yeah, it's not the first time, you know?
EL: Yeah. Well, yeah, I hope it continues to progress on the cleanup and everything. And today, shifting gears entirely, we are very excited to have Kimberly Ayers on the show. Welcome, Kimberly, would you like to tell us a little bit about yourself?
Kimberly Ayers: Hi, thank you. Yeah, I'm super excited to be here. And happy early birthday, Kevin.
KK: Thanks.
KA: So I am an assistant professor in the math department at California State University San Marcos, which is about half an hour north of San Diego, for those of you who are less familiar with California geography. And my research is in dynamical systems and ergodic theory.
KK: Cool.
EL: Nice. And I said “shifting gears” because I know you're also a biking enthusiast like I am.
KA: I am. Yes. I love to get out on my bike. And California weather is — San Diego weather, it’s hard not to be outside. So I’m a big bike fan.
EL: Yeah, I — the other day, someone on a local social media thread was posting like, you know, “We shouldn't have good bike infrastructure in Salt Lake because, you know, we're not San Diego, so there's so little time that you can bike here.” And I was like, well, first of all, that's just not true. But you do live that dream of the, like, San Diego biking weather all year.
KA: Yeah, it’s — I can't complain about it.
KK: Sure. Well, it doesn't really snow that much in Salt Lake, right? I mean, it hits the mountains. But yeah, I mean, so when I was a postdoc in Chicago, I cycled a lot, but come November I was finished, right?
EL: Yeah, because the roads just never get all the way clear, but here it's dry enough that they do get cleared. And so — you know, I am not an especially hardy person. But, you know, if you’ve got some layers on and the ice is off the road, it's actually doable.
KK: It’s not a problem.
EL: I discovered. I mean, this was a pandemic discovery because I grew up in Texas, and I would just put my bike away in, like, November here, but decided, mental health-wise, that I really needed that, especially during the height of that covid winter — our first covid winter. Anyway, lovely to have, I guess, three people who enjoy biking on this show, but we are not here to talk about biking. We are here to talk about Kimberly's favorite theorem. So yeah, what is that?
KA: So my favorite theorem is a theorem called Sharkovskii's theorem, which is a pretty famous theorem in dynamics. To back up a little bit, when I talk about dynamics, right now I'm talking about discrete dynamical systems, where the idea is if you have a function that has the same domain and codomain, you can think about compositions of that function with itself, right? You can take successive compositions over and over again. And so as a dynamicist, I'm interested in looking at these sequences, like if I start with the point x in my domain, and then I apply f, so I get f(x), and then apply f again, get f(f(x)), and then f(f(f(x))), and so on and so forth, right? This is a sequence. And so we can ask questions about sequences, right? We can ask, like, do they converge? Or maybe if they don't converge, do they have a convergent subsequence? Do they ever repeat themselves? And that repeating themselves is actually what Sharkovskii’s theorem is all about. So Sharkovskii's theorem is about continuous functions on the real line. So it's important to say that there is, at the moment, no, like, higher-dimensional analog to Sharkovskii’s theorem. This only applies to one-dimensional functions. But Sharkovskii’s theorem says, okay, so we have to take a really weird ordering on the natural numbers. So we're going to start with all of the odd numbers except for 1. So starting at 3, we'll take all the odd numbers in a row, so 3, 5, 7, 9, 11, so on and so forth. And then, once you're “done” with all the odd numbers, yes, then you'll consider 2×3, 2×5, 2×7. And again, all of those, right, and then 2²×3, 2²×5, taking higher powers of 2 multiplied by 3, 5, 7, … And then again, once you're “done” with all of those, the only numbers that you haven't included are the powers of 2, so then you take all of the powers of 2 in descending order, until you get back to 1. So it's not a well-ordering, right? You have to kind of wrap your mind around this fact of like, once you're “done” with the odd numbers, then…
EL: Yeah, right.
KA: But it is a total ordering.
EL: Yeah, so you can take any two numbers, you can tell, like, which one is before the other one.
KA: Exactly.
KK: Right.
EL: But you can't say this one is 17th. Well, you can actually say which one is 17th in the series, but, like, a number that isn't an odd number, you can’t say what position it has.
KA: Exactly, exactly. If I were to count, like, the natural numbers in their usual ordering, you know that eventually, you're going to get — like, if you asked me, are you eventually going to hit 571? Yes, I will. Right. But if I were to try to do this with the Sharkovskii ordering, that’s not going to happen, right? So it's kind of weird that there's a minimal and a maximal element in the Sharkovskii ordering. So I haven't told you the punchline yet.
EL: Yeah, we’re just wrapping our head around.
KA: But yeah, exactly, right. So we start by taking this really strange ordering on the natural numbers. And then what Sharkovskii’s theorem says is if you have a period — so I guess I didn't quite define which way the ordering goes, but let's say that 3 is the maximal element and 1 is the minimal element. So Sharkovskii’s theorem says that if you have a period N orbit for some discrete mapping on the real numbers, then for every M that's less than N, you also have a periodic orbit of that period. So, for instance, if you have a period 2 orbit, you have to have a period 1 orbit, which is what we just call a fixed point, right? If you have period 64, then you also have a period 32, 16, 8, 4, 2, 1, etcetera. And probably most exciting is that if you have a period 3 orbit, then you are guaranteed to have periodic orbits of any other period.
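[A note for readers who want to experiment: here is a minimal Python sketch of the ordering just described. It is not from the episode, and the function names are illustrative. Writing each natural number as a power of 2 times an odd part, the Sharkovskii ordering becomes a simple sort key.]

def sharkovskii_key(n):
    # Sort key: sorting ascending lists the natural numbers in the Sharkovskii order,
    # 3, 5, 7, ..., 2×3, 2×5, ..., 2²×3, 2²×5, ..., 8, 4, 2, 1.
    k = 0
    while n % 2 == 0:
        n //= 2
        k += 1
    if n > 1:               # odd part bigger than 1: group by power of 2, then by odd part
        return (0, k, n)
    return (1, -k)          # pure powers of 2 come last, in descending order

def forces(n, m):
    # True if a period-n orbit forces a period-m orbit, i.e. n comes no later
    # than m in the Sharkovskii ordering.
    return sharkovskii_key(n) <= sharkovskii_key(m)

print(forces(3, 571))   # True: period 3 forces every period
print(forces(64, 32))   # True: a power of 2 forces all smaller powers of 2
print(forces(64, 6))    # False: 6 = 2×3 comes before every power of 2

[Sorting any list of natural numbers by sharkovskii_key reproduces the ordering Kimberly describes, with 3 first and 1 last.]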
KK: Right.
KA: So sometimes people talk about Sharkovskii’s theorem, and what they say is period 3 implies chaos.
EL: Yeah.
KA: Now, I haven't told you what chaos is. And I kind of joke, actually, that chaos in the dynamics community is sort of a bit of a chaotic concept in and of itself, because there is really no one universally accepted definition of chaos. There are several different types of chaos. There's what we call, like, Devaney chaos or Li–Yorke chaos. But this definition of chaos, which I believe is Li–Yorke, says that in order to have what we call chaos, you need periodic orbits of all periods. So that's where period 3 gives you that condition. You also need an orbit that is going to be dense in your space, so an orbit that kind of fills up your entire space. And then you need this other thing, which is probably the most famous aspect of chaos theory, which is the sensitive dependence on initial conditions, otherwise sometimes termed the butterfly effect, which basically says that if you have two points, no matter how close together your initial points are, if you apply f enough times, their sequences, their orbits, eventually grow some distance apart from each other, no matter how close together they start. So there's essentially, like, no room for error if you have a chaotic system.
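[For reference, the sensitive-dependence condition Kimberly describes can be written out precisely. One standard formulation, not a quote from the episode: there is a sensitivity constant δ > 0 such that for every point x and every ε > 0, there is a point y with |x − y| < ε and a time n with |fⁿ(x) − fⁿ(y)| ≥ δ. In other words, arbitrarily close to any starting point there are points whose orbits eventually separate from its orbit by at least δ.]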
KK: Right, right.
KA: So that's why once you have period 3, you're guaranteed at least one of the requirements for a chaotic system, right? So my students asked me the other day, they were like, what's your favorite number? And I was like, Oh, that's a really hard question to answer.
KK: Three!
KA: But I think it has to be three because of this. Like, you see a period 3 orbit, you kind of automatically get excited, because those are, like, pretty rare. So that is, yeah, I guess I have to say that three is my favorite number.
EL: So we get a favorite number for free on this episode.
KA: A favorite number and a favorite theorem!
EL: Okay, so I want — I'm dragging a little bit this morning. And so having a little trouble with, you know, putting these things together. So what I mean, so we have this weird order on the natural numbers. Like sure, I'll let you do that. Can't really stop you. So what kind of — what is our f? What? What kinds of dynamical systems are we talking about here?
KA: Yeah, so let's talk about a couple of famous examples. So oftentimes — I've been talking about functions on the real line, but I guess dynamicists really like studying functions on compact sets, because if you have compactness, you're guaranteed things have convergent subsequences. So we get what we call this limit behavior. So a lot of dynamicists study functions on, like, the closed unit interval. And so some examples are what's called the logistic map, which is a quadratic function. So it's some parameter R times x times 1−x. So it looks like an upside down parabola, right, it intersects the x-axis at 0 and 1. And then in order to make sure that you map back to things in between the 0 and 1 interval, we're going to require that R be between 0 and 4, right?
KK: Right.
KA: If R is bigger than 4, then the top of that parabola bumps up above 1, and, and there are actually cool things that you can study with that as well. But I'll talk about that another day.
KK: Oh, the bifurcation diagrams. That's what you want to talk about. Right?
KA: Yeah, once you vary that, so that's the cool thing, is like as you vary this parameter R, if you start at 0 and then kind of think about what happens as you increase R a little bit, you see these periodic orbits appear in exactly the order that Sharkovskii tells you they're going to happen. So you start with a fixed point, so that's your period 1 orbit, right? And then I believe it's once R gets above 3, that's when that period 2 orbit shows up, right? And then if you bump up R a little bit more, then you see a period 4 orbit appear, and then a period 8, and we call this a period doubling cascade. And then, of course, as you can imagine, there's only finite room for R, right, we only consider up to 4, but there's infinitely many kinds of periodic orbits that have to appear, so they have to start coming at you faster and faster and faster. And then once R gets above — I want to say it's roughly 3.87, and it's a number that we've only sort of gotten numerically, unfortunately — then you enter what we call the chaotic regime, which means that period 3 orbit shows up. And then we know we have out there somewhere periodic orbits of all periods. So people might be familiar with this bifurcation diagram, which shows kind of like where the periodic orbits appear. And it starts off, like on the left hand side, looking pretty simple and smooth. But as you go to the right hand side, it gets more, like, fractally looking. And that's because, once again, those periodic orbits have to show up more and more and more quickly, right, as R increases in value. It’s very cool.
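[For readers who want to watch the cascade numerically, here is a minimal pure-Python sketch. It is not from the episode; the parameter values, tolerances, and function names are illustrative choices.]

def logistic(r, x):
    # One step of the logistic map f(x) = r*x*(1 - x) on the unit interval.
    return r * x * (1.0 - x)

def attractor_period(r, x0=0.2, burn_in=5000, max_period=64, tol=1e-5):
    # Crude numerical estimate of the period of the attracting orbit, if there is one.
    # Returns None when no period up to max_period is detected (e.g. in the chaotic regime).
    x = x0
    for _ in range(burn_in):          # let the transient die out
        x = logistic(r, x)
    orbit = [x]
    for _ in range(max_period):
        x = logistic(r, x)
        orbit.append(x)
    for p in range(1, max_period + 1):
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return None

for r in (2.8, 3.2, 3.5, 3.55, 3.83):
    print(r, attractor_period(r))
# Expect roughly: period 1, 2, 4, 8, and then period 3 inside the famous period-3 window.

[This sketch only detects the stable orbit that nearby points settle onto; the unstable periodic orbits that Sharkovskii's theorem guarantees are invisible to a simulation like this, which is exactly the point Kimberly makes later about them being hard to find.]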
EL: You are just blowing my mind, that there isn't just, like — what I thought was going to happen here was, like, we would only get these powers of 2 ones. And then you're going to tell me a different dynamical system later that would have — but like, because it makes sense, it somehow makes sense to be like, at 3, we get this, at, you know, 3 1/2, we get this, at 3 3/4, we get that. And, like, the fact that, like, we at some point do shift to this period 3 place?
KA: Right!
EL: And we can't even know what the number is? I’m sorry!
KK: And these functions are so simple, right? It's just a quadratic. It's a tent map, basically. Right?
KA: I know! It blows my mind every time I think about it. Yeah, so okay, so you're seeing the beauty in this theorem now. Like, just how cool it is. And I guess I'm gonna, you know, shift gears, pun completely intended there, once again, because not only is the statement of the theorem really cool, but the proof of the theorem is really incredibly beautiful. And so I'm going to do my best to explain it without being able to draw anything. But let's talk specifically about the period 3 implies periodic orbits of every other period, that kind of statement right there. So let's suppose that you have a period 3 orbit, so you've got three points on the real line, call them a, b, and c, such that f(a)=b, f(b)=c, f(c)=a, so they go in this cycle, right? You can label — so that's going to partition, at least between those, into two different intervals, right? So let's say that you have a, b, and c arranged from left to right. And I'm going to — it's going to be a little bit weird, but there's a reason — call I1 the interval between b and c, and I2 the interval between a and b. So what you see happening, then, is that because of continuity, that interval I1, when you think of what it maps to under f, it has to cover that entire interval of I1 and I2, right? And also I2 maps to something that at least contains I1. Right? And so you've got this — I'm gesturing a lot with my hands, which I realize nobody can see.
KK: We do this all the time.
EL: Just to slow it down a little.
KA: Yeah.
EL: So the I2 has to cover I1, it’s kind of obvious-ish if you've drawn this like I have.
KA: Right, because you think about where the endpoints go.
EL: So yeah. Can we slow down on this I1 thing?
KA: Yes. So remember, b goes to c and c goes back around to a.
EL: C goes to a. Okay.
KA: Right? So if you think about what I1 does, it kind of flips upside down, and then maybe kind of stretches and comes back around, right? And so once again, I'm assuming that f is a continuous function, so you're guaranteed fixed points when you map an interval to itself, right, if you have a continuous function. And so since I kind of have this structure, what I'm going to do is I'm going to draw a graph, a directed graph, where my two vertices are going to represent I1 and I2. And then the way that you can kind of visualize this is that I1 has a loop to itself, because, if you think about it, I1 kind of covers itself. Right? There's an edge that goes from I1 to I2, because we said that I1 covers I2, and then there's also an edge that goes from I2 to I1. Okay?
KK: So you’ve got a directed loop in this graph.
EL: Yeah, a lollipop.
KA: Exactly. And now the really cool thing is you can create closed paths of any length in that graph, right, just by starting at I2, going up to I1, going around I1 as many times as you need, and then coming back to I2. Right? And that — traversing along that graph is essentially what tells you about what the periodic orbits are doing. There's some analysis here, when you think about intervals mapping to themselves and guaranteeing fixed points. And so we say a periodic orbit sort of follows this loop if it kind of travels between these intervals, and then comes back to itself.
EL: Yeah, so you've got like some other theorem kind of sitting in the background saying, like, some interval along this has to — or sorry, some point in this interval, has to do this precise thing.
KA: Exactly. And you can prove that via, I think it's a result of the intermediate value theorem, because your function is continuous.
KK: Brouwer fixed point theorem, essentially.
KA: Right, exactly. You're essentially applying that and then you're just walking along this graph, and all you need to do is make sure that you start and end at the same point, in order to get that periodic-ness. And so then the punch line is that once you have this graph, now I can create, you know, a path, a closed path, of any length that I want, just from the existence of that period 3 orbit.
KK: Yeah. Okay.
KA: And so I really love that — you know, I'm not a graph theorist. For a long time, I was calling closed paths loops, and my graph theorist friends got really mad at me, told me I had to stop doing that. So now I'm like, okay, closed paths. But I love that it builds — you look at the structure of the graph, and you kind of just read it off from there. And so the proof of the entire Sharkovskii’s theorem basically constructs these graphs with these intervals, and looks at which intervals map to each other, and then you draw the edges as they are needed. And then you, again, you just read off, like, what length closed paths can you get from this. And it’s so cool.
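[In symbols, the picture from this stretch of the conversation: the covering relations are f(I1) ⊇ I1 ∪ I2 and f(I2) ⊇ I1, so the directed graph has a loop at I1, an edge from I1 to I2, and an edge from I2 to I1. For any n ≥ 2, the closed path I2 → I1 → I1 → ⋯ → I1 → I2, with n − 1 visits to I1, has length n, and the intermediate value argument produces a point x with fⁿ(x) = x whose orbit follows that itinerary. The loop at I1 alone handles n = 1, and a little extra bookkeeping, which lives in the actual proof, shows the least period really is n. This is a summary of the argument as described in the episode, not a complete proof.]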
EL: Yeah, there's something so appealing about this, because, you know, you see this, and you're like, oh, man, I'm going to have to find a fixed point. Okay. Is it going to be, like, 3/4 of b plus? Like five? But no, you just have to lose all of this.
KA: You don't need to find it. You just know it's there.
EL: You can almost forget the whole dynamical system!
KA: Exactly.
EL: The whole — the actual nitty gritty details of what's happening with this dynamical system — and just say, like, oh, look, I made my little lollipop.
KA: You just take all that information, and you put it in the graph, and then you can kind of almost forget about the dynamical system and just look at the graph.
KK: Well, until you want to actually find the orbit, right? I mean…
KA: Yeah, well, sure. And that is a much more challenging question, right? Because these orbits are often what we call unstable, which means that other orbits don't converge to them. And so in order to find them, you kind of have to start, like, exactly on top of them, right? And that's very hard to do.
KK: Right, because because of the chaos business?
KA: Exactly, exactly. There’s zero room for error, literally.
EL: And so was this a love at first sight kind of theorem for you?
KA: Oh, absolutely. Once I learned it in grad school, I was just really taken away with — I started by being really captivated by this ordering on the naturals, because I never thought to, like, rearrange the natural numbers — you know, the usual ordering seemed good enough to me. But like, oh, we can rearrange in this ordering, like, maybe is — sorry, my dog is groaning behind me. I hope that that's coming through. We can take this different ordering, and it still like, makes sense. I can still compare any two natural numbers. But it just completely like violates my intuition. And then I saw the proof and I was like, oh, that's just beautiful. Taking all of this information and reducing it down to this directed graph and then just reading off the graph is just so cool to me.
EL: Yeah, so I'm kind of maybe opening Pandora's box here, but, so we just did the period 3 implies all other periods kind of thing. So I — so is the proof, if you wanted to show that, like, period 7 implies period 9 or something, do you do the same kind of thing?
KA: It’s the same idea. Yeah, you draw — again, you think about a period 7 orbit, you label the intervals in a very specific way, and you have to be a bit careful about the way that you label the intervals to make sure which intervals are going to cover which ones. And then again, you draw the graph, and then you can just read things off the graph.
EL: Okay.
KA: So that's exactly how it works. And so there is kind of a generalized graph that has structure that's a little bit too difficult for me to explain right now. But, you know, we'll say, like, take an arbitrary natural number k, we can generally describe the structure of what that graph is going to look like, and then we can describe what the lengths of the closed paths are going to be.
KK: Right, right.
EL: Yeah. Okay. So cool. So, like I said earlier — well, before we started recording — I've heard this “period three implies chaos,” it’s such a great tagline and stuff. And I always nod, like, yeah, yeah, it totally implies chaos. You know, I have looked at this a little bit, but I'm really happy to get to know a little more about what this theorem actually means so I don't have to just pretend I actually understand what’s going on anymore.
KA: Yeah, it is a really cool, like you said, tagline for this theorem. It's very punchy and succinct.
EL: Yes.
KK: Yeah. Very cool. So okay, now, I’ve got to know. So the other part of this podcast is, we ask our guests to pair their theorem with something. So what pairs well with Sharkovskii’s theorem?
KA: Okay, so I gave this a lot of thought. And I think I came up with a really great pairing. I don't know if you all have ever watched taffy pulling videos?
KK: Absolutely.
EL: Yeah, I actually wrote an article.
KK: Right. I remember this.
EL: Yeah. Dynamical systems in taffy pulling at one point.
KA: Yeah, so I will say I don't personally particularly enjoy taffy. I think it's sweet and sticky and makes my teeth feel kind of weird. But I could watch videos of taffy being pulled for hours. So the idea — and, you know, I don't make candy, so I don't necessarily know what I'm talking about — but my understanding is, they have this big mass of sugar that they've boiled. And they need to basically aerate it, they need to introduce air into it somehow. And so you can do this either on a machine or by hand. But basically, the taffy has to be stretched and then folded back on itself and stretched and folded back. And they just do this over and over and over again. And I think about, like, well, that's exactly what we're talking about doing with the unit interval when we talk about a continuous function on the unit interval. We're kind of deforming the unit interval somehow and then putting it back on top of itself. And then just doing that over and over and over and over again.
KK: Right? Yeah, so.
KA: Go ahead, Kevin.
KK: No, I was just going to say, this reminds me — so Evelyn, I think you had this diagram of the taffy pulling machine in your article, right? It is really fascinating. That's how these work — sort of, it's 3 things. Yeah. 3 shows up everywhere, right?
KA: Yeah, and now again, I don't know anything about this, but I wonder if that's intentional. And what's also really cool is sometimes they'll put — if they’re going to dye the taffy a certain color, they put a little bit of coloring just somewhere on the taffy, and then as the taffy gets pulled, that color works its way through the entire thing. And I like to think of that as an analogue to the dense orbit: you start in a very concentrated little area, but slowly this dye works its way throughout the entire mass of candy.
KK: No, I think that's actually an excellent analogy. That's exactly what's happening. And so I wonder if — the inventor didn't know this theorem, of course. But somehow taffy pullers intuitively knew that 3 would do the trick.
KA: Three is the one that would work.
KK: Two’s not going to do what we want — it'll just sort of flip it over itself — but three will really, you know, yeah, braid it at least.
EL: Yeah, it's cool. It was so long ago at this point that I wrote that that I can't remember the punchline of my article. I might need to go find my article and read it again. Be like, oh, that person really wrote so well! But yeah, that's fun, and actually as a kid I loved taffy. But as an adult with, you know, various tooth and jaw problems, it's not the very friendliest candy.
KA: No, it's not. But at least you can watch videos of taffy being pulled without having to actually eat it.
EL: That really is the fun part.
KA: And actually, there’s one more thing that I should tell you about Sharkovskii that is very cool. There's this thing that's called the — maybe this is not quite what they call it, but they call it the converse Sharkovskii — which says that, so we have this ordering on the natural numbers, for every tail of that ordering, there is going to be a continuous function that has periodic orbits of exactly those periods.
EL: And, like, none before it.
KA: And none before it, every single one. So it's kind of like this sharp, you know, it goes both ways.
KK: So for every natural number, there is a dynamical system that has an orbit of that order, and of everything less than that in the Sharkovskii ordering, but nothing before.
KA: Exactly, but nothing that came before. So it goes both ways, in a way, right.
KK: So there's something with an orbit of order 57, but not 55.
KA: Exactly. Exactly.
KK: All right. Interesting.
KA: So I think that's also a very cool result.
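[Stated a little more carefully — this is the standard formulation of the converse, not a quote from the episode: for every natural number n there is a continuous function on the interval whose set of periods is exactly the numbers that n forces in the Sharkovskii ordering, and there is also a continuous function whose set of periods is exactly the powers of 2. One standard family of examples is the truncated tent maps.]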
KK: Dynamics is hard!
KA: Yeah, it is. And I feel like I'm constantly having to, like, I feel like my spatial sense is never very good. But maybe it's gotten better over the course of me studying dynamics more. I’m, like, constantly having to turn things around in my brain and, like, fold things over. And yeah, I love it a lot.
KK: So we also like to give our guests a chance to plug anything. Where can we find you on the worldwide intertubes?
KA: So I am on Twitter. My handle is @kimdayers. Some other cool projects going on: I've been doing a fair amount with LGBTQ people in STEM. And so I did an interview over the summer talking to this organization called LGBT Tech, and you can find that on YouTube, just talking about my experience as a mathematician and my identity as a queer woman. And so that's something that I'm very passionate about. And you know, if anybody ever wants to talk more about that, they're welcome to reach out to me on Twitter.
EL: Nice.
KK: Excellent.
EL: Yeah, we'll include a link to that in the show notes. Check those out. Yeah. Thanks for joining us.
KK: Yeah, this has been great. Really great.
KA: Thank you so much. This was a lot of fun.
KK: Good. Take care.
[outro]
On this episode, we were happy to have Kimberly Ayers of California State University San Marcos on the podcast to talk about Sharkovskii's theorem. Here are some links you might enjoy perusing after you listen to the episode.
Ayers' website and Twitter account
Her interview with LGBT Tech
Tien-Yien Li and James A. Yorke's article Period Three Implies Chaos
Our "flash favorite theorem" episode, where Michelle Manes also professed her love of Sharkovskii's theorem
Evelyn's Smithsonian article about the mathematics of taffy pullers