Collective Perspective Podcast
Episode 68: How to Train Your Algorithm

Jan 28, 2026 | 00:28:41

Hosted By

Travis Eadens, Jeff Aldrich, DJ Malone (Season 1)

Show Notes

Reclaiming Your Mind in the Age of Distraction

In Episode 68 of The Collective Perspective, Jeff Aldrich and Travis Eadens dive into the hidden world of algorithms: not just the ones on your phone, but the ones shaping your thoughts, habits, and daily decisions.

We talk about how to retrain your digital feed, protect your attention, and take back ownership of what influences you. From social media to streaming platforms, your algorithm learns from you, but it's time to teach it what really matters.

This conversation challenges the idea that we’re just passive consumers. Instead, we explore practical ways to become intentional curators of our digital lives, aligning what we see and hear with who we want to become.

Topics Covered:
• The psychology behind digital algorithms
• How social media shapes emotion and focus
• Why your attention is a form of currency
• Practical steps to reset your feed
• The connection between what you watch and who you are

The Collective Perspective Podcast, where faith, logic, and community meet.
Helping Americans rise above division and distraction, one honest conversation at a time.

Subscribe for new episodes every week
Follow us on Instagram: @thecollectiveperspectivepodcast
Listen on Spotify, Apple Podcasts, or wherever you stream

Chapters

  • (00:00:00) - Introduction: The Power of Real People
  • (00:00:37) - How to Train Your Algorithm
  • (00:01:41) - The Intricacies of Algorithms
  • (00:05:33) - Practical Tips for Controlling Your Algorithm
  • (00:09:09) - The Social Media Experiment
  • (00:09:56) - The Impact of Social Media on Mental Health
  • (00:16:04) - Strategies for a Healthier Algorithm
  • (00:27:20) - Conclusion: Finding Common Ground

Episode Transcript

This isn't just a podcast, it's a reminder. A reminder that what makes America stronger isn't a headline or a hashtag. It's people, from the ones building our homes to the ones rebuilding their lives: veterans, tradesmen, neighbors, and volunteers, real people doing real things. Here we find common ground first, and then we work on our differences. This is the Collective Perspective Podcast, where purpose, people, and progress meet.

Hey everybody, welcome back to the Collective Perspective Podcast. My name is Jeff and I'm here with my buddy Trav. What's up, Trav? Hey everybody, welcome back. Glad to be here. Let's get this show on the road. Yeah, this is going to be a fun one. I've been throwing this idea around for a while. You've heard of How to Train Your Dragon; well, think of this as How to Train Your Algorithm. That's a pretty good catch there. Thanks. You've got to have those keywords every once in a while.

So what I wanted to talk about is how to train your algorithm. First of all, my wife complains all the time about thinking of something; she never writes it down or types it, it never leaves her brain, and yet it shows up in her algorithm. Yeah, it seems like some sort of ESP, like it's reading your mind. It does seem kind of weird that you just think of something and there it is. In thinking about that, I realized that what someone's algorithm serves them says a lot about that person. Yeah, it can say a lot about a person. But I also think it keys in on when you pause on something, just long enough to understand what's being said, and then it may decide you're interested in that. And I think it goes beyond social media; your phone is doing most of the tracking. For example, I could be looking at certain things on Google or Chrome, or Safari, and then it will show up in my social media feeds.

Well, I think you said it in one of our previous episodes, quite a while ago: if you're not paying for the product, you are the product. Think about how many of the web browsers and apps on our phones we don't pay for. Anytime we type anything into them, that's the product; that's our searches. But how things tend to pop up without even searching on your phone is kind of interesting. And it doesn't necessarily need to be your phone; it's whatever account you're logged into, even on your computer. They've got to make their money somewhere, so they target you with ads, and even further, they target you with content.

Yeah, and I've noticed that when I was scrolling down Facebook, I wouldn't even see a post from anybody I knew. It was five, six, sometimes seven different advertisements or content creators in between each post before I found something from someone I know or personally follow. That sounds about right; somebody you know would be about one in seven on your main feed. Obviously, if you go to your friends feed, all you're going to see is your friends, with no ads.
But honestly, I feel that part of the algorithm feeds division. I don't know if that's the way it was created or if it's something within myself or someone else that gets drawn to it. I like to debate, but I've realized that trying to debate anyone online, on social media rather than in person, just doesn't work, because it all comes down to interpretation and whether you even understand what they're saying. Some people don't use proper grammar, and that's hard to decipher sometimes.

So what have you been thinking about algorithms lately? There's a reason for this podcast; let's see what it is. I don't know if everybody's privy to it, but I believe you can train your own algorithm. I agree with that. I think we can, and it takes some work and some discipline. This isn't the first video or conversation about how to train your algorithm, but when it comes to training it, there are some things to consider.

One: turn off the microphone in your apps, in all of them. Go into the settings on your phone or tablet, go to each individual app, and there should be a selection for turning the microphone on and off. A lot of services, especially Facebook, also have a term in the agreement that allows them to provide your information to third-party affiliates. I think that's part of it: selling your information is where they get their money. And even on your iPhone, if you have one, you can apparently turn that off. I don't necessarily believe it all, but I think I did that on some of the apps I have: turn off the microphone, turn off sharing information with third parties, and only allow the microphone while the app is actually in use. So it's important that when you're done with an app, you actually close it. Don't just leave it running in the background; swipe it away in the app switcher so that it shuts that feature off.

That's a good point. I also use different apps for different things. Although I do look for helpful things on YouTube, a lot of it is politics, and on Instagram I'm typically looking at food or music of some sort. And if you have your profile and then create another profile whose main content is, say, audio gear, microphones and speakers, your algorithm says, okay, this is what he has an interest in. Really? So it combines the two profiles as far as the content it delivers? Yes. Wow. I have probably four different profiles on Instagram. So you're seeing the same thing over and over, just on different profiles. I usually only stay on one profile, and I'll go and post things on the other profiles. Okay, so you're only getting the content from one, and that's where it's being trained to what you like, and then you post elsewhere. But even that is taken into account in some instances, isn't it? The things you post are part of what they see as an interest? Yes.
On Facebook, for example, I post a lot about food, but it gets me stuck on the politics side too, because I seem to find myself arguing with people, which I think has just created division. I could see a post from a friend from five days ago, or go to my friend's page and look, and they're not showing me all their posts; they're just showing me the posts I would oppose. Call me crazy, but I'm not. It's true.

So in order to train your algorithm, we decided to do a little experiment. Well, okay, maybe we decided. I said, you know what, I am so sick and tired of Facebook, and I was kind of forced, or peer pressured, into it. The interesting thing was that you didn't want to give it up. No, you're absolutely right. It's that dopamine hit. It was more about the fear of being in the dark about something. Fear of missing out. FOMO. They make commercials about FOMO. That's what they call it, fear of missing out.

Well, I guess there is some upside to Facebook, and that's keeping up with friends. Unless you talk to someone, and you don't go on social media, you don't know what's going on in their day, but they usually give a glimpse of something: a family trip, a little excursion, going out exploring and doing something different. Me, just like anybody else, I'm not going to post my day-to-day struggles on social media. You're only going to see the rosy side of things, mostly. So I think seeing that uplifting part of somebody else's journey is what we like to see, and it gives me hope. I think you can go to a pretty dark place if you start posting all the things going wrong in your day and you don't stay positive. Yeah, I agree. You are what you eat; what goes in comes out. And if you're feeding on outrage all day, then that's all you're really going to see.

Now there's the echo chamber too, and to be honest, that feels more like what the algorithm does. It amplifies and reinforces the beliefs you already have, so it doesn't give you anything outside of what you're already feeling, or feeling against. I think a lot of social media is largely an echo chamber: you're going to get back what you're feeding into it.

So I guess you would ask yourself, how do you take back control? Back to the original question: how do you train your algorithm, and what about that little anecdotal experiment we did? So here's what I did, and what Jeff pressured me into; we'll talk about that. He had me just stop using it. He wanted me to delete it off my phone, but I didn't do that, and maybe that wasn't the full way to see if it reset my algorithm. I spent a full seven days without checking Facebook, and it was pretty tough. What was tough about it? In the beginning, I didn't want to give it up. That was probably the hardest part. But once I did, I realized I had more time to look at other things, to go through some news feeds or whatever else I was looking at. And you know, they design these apps so well.
I kept seeing that number creep up: 10 notifications, 15 notifications, 30 notifications. It's like, hmm, what are they? But at the end of it, I just let it go. Self-control is a hard one for a lot of people, especially when you're addicted to something, and yeah, I'd say I'm addicted to social media. And you know what, that's the first step of recovery: admitting you have a problem. That's true.

To be honest, I deleted Facebook off my phone; sorry, I did not delete my account. There is a back door in, which I didn't know about originally: if you have Messenger, you can click on your profile and open the web browser. The web browser opens up, not the app, and it shows you the notifications and whatnot. It's very basic. But I can tell you that when I originally wanted to delete it, it was because I found myself 12 minutes late, in military terms. I mean, let's face it, we're not in the military anymore, but the military sense of time, when you have to be somewhere, still applies. Yeah, if you're not 15 minutes early, you're late. Right. Old habits die hard, and honestly that's not a bad habit to have, trying to be on time or early to wherever you're going. I think it shows respect to whoever you're meeting; that's just a personal opinion, and it's probably why the military instilled it.

Well, I was in charge of the kids that day, so I had to make sure they were ready for school, get them in the car, and take them. I was only 12 minutes late, but it still made me mad that I had sat there trying to debate my friend over Tylenol. So I said, you know what, I'm deleting it. And I can tell you the addicting part was that when I first opened my phone, I would search for that app. So I said, I'm just going to delete the app, because then I can't open it. It's been about a month now. It's been a while for you; you know what the outcome is. What is your outcome? I don't stress. I'm not arguing, or preparing to. Sometimes I would get prepared for an answer, or I'd get worked up, like, what is this person thinking? They just don't get it; this is blowing my mind; I've got to tell them. I don't have to worry about that, or partake in that, anymore. Well, that's a good outcome for you. It's taken a little bit of stress, a little bit of distress, away from you.

So what it's allowed me to do to retrain it is to click on content that I want to see more of. I've noticed that on X, they ask, does this content interest you? Right. I think I've even seen that on Instagram, and you can find it on YouTube and Facebook too. I don't know about TikTok, because I don't do TikTok. I guess at our age and demographic we don't do TikTok, or Snap-click or whatever. Snapchat. Snap-click, I like it. So I would just go around and, like I said, click on stuff I had more interest in. I would avoid going on Facebook, just because I'm trying to teach Facebook a lesson. I'm the boss, you're not. I'm the captain now. That's what it was.
On all of the social media sites I mentioned, you can also mark what does and doesn't interest you. Usually it's the three little dots in the top corner of the post, typically where the name is, or somewhere on the video, and you can say, hey, I'm not interested in this. Now, I've had that backfire on me, because in some of my Facebook groups I would say, I don't want to see this, and then they would give me more of it.

I did something similar a while ago, because I use Pandora a lot for music. Early on, probably five to eight years ago, I started liking songs they were playing, and now I get those songs on repeat all the time, and it kind of limits the scope of the music I'm listening to. I like to listen to just about everything. If I like a song, I like that genre, but it would only give me that artist or something extremely close to that artist. So I've stopped hitting like on Pandora when I hear a song I haven't heard before, or one I haven't heard in a long time, and instead I'll go search for individual songs if I want to hear something special or something new.

One of the things about Instagram that I do is humor. I listen to a comedy show every night on YouTube, and my friends and I pass around memes that we find funny, so that's always entertaining. Well, it keeps you in some type of algorithm loop, but usually the fun thing is somebody sends you something, you keep scrolling down that algorithm, and you find other funny things that maybe they didn't see. So there's always that.

But using tools like lists, I think one of the things you can do to train your algorithm is to type in and search for a topic. Yeah, that's a good idea. And like and comment on the stories and posts you see. The like button isn't just there for someone to collect likes; it's so they know whether you like it or not, so they can feed you more. It's like a reward for good behavior. I'm sparing when it comes to liking videos on YouTube, because I think sometimes people go a little over the top and get away from their original message and their original intent. So if I see something that gets back to that original intent, back to what drew me into that channel, I'll give it a like, because I think it rewards them, but it also helps feed my algorithm. It helps me curate my views, my interests.

Now, one of the downsides to algorithms is that they also breed the extreme. Yes. Outrage keeps people engaged, and fear. So they're going to amplify those things to keep you engaged. I don't know what kind of money they get just from you staying on their website. No, I think you're right, but I think it comes down to the things you buy. And I think the longer you view something, the more they, and we keep using that term, "they"...
They, meaning the algorithm and the moderators, are the ones driving it; that's how they get their ad revenue. If you're inclined to watch certain types of videos, there are other algorithms that say, well, if they're watching this, they're going to be more interested in these products, so let's start targeting ads toward those products.

So another way to do it is to follow creators like us who feed you good morals and ethics. I mean, you don't have to agree with everything we have to say, but I don't think there's been much we've discussed where you have to choose a side. A lot of what we've been talking about is just logic. I think we're more of a glorified Webster's dictionary. No, I think we're talking about our lived experience in a positive way, something we don't see much of on the internet these days. That's right. I think that's hitting the nail on the head right there. I don't know if I want to get into echo chambers, left or right; we go into the political divide in a lot of our podcasts, and I've already gone past that, because we did the bridge episode. We bridged. Like, join this podcast.

So you know, Travis, how we've talked about a diet? Well, a steady diet of certain things. In previous podcasts we've talked about how what comes in, what you feed your body, matters more than the exercises you do. And that goes for your brain and what you're feeding it. Content is feeding your brain, and anything you're good at, or are going to be good at, whether that's training your algorithm or training yourself, comes down to consistency. Just like gaining weight: you don't put it on overnight, and it's not going to go away overnight. Physical health and mental health tie together more than you think. But when it comes to algorithms, you have to feed yourself the healthy things. Instead of waking up in the morning, going straight to Facebook, and getting caught in a political argument, why don't you do something else? Instead of clicking on that political argument, click on the best barrels of the day. I watch a lot of surf videos, and when I get on Facebook it shows me what's new in the surf world. To me that's lighthearted, it's apolitical, it's a good diet that makes you feel good and say, yeah, I need to get back out there.

Maybe the real question isn't just what's on your feed; it's what you're teaching it to value. That's a good point. What are we teaching it to value? A friend mentioned to me, when I was preparing for this, that you can tell a lot about somebody by what their algorithm has on it. We said that earlier in the show, but to go back to it: I don't think that necessarily says the person is bad, but it shows where their interests are, what their algorithm is mostly about. I obviously explained that I have different social media paths for different things, and I explained earlier in the podcast what my interests are in each. I could take politics out of it, but I like to know what's going on in the world, or at least somebody's perspective of it. I do too.
And to me, that's why I use different platforms the way I do. I use X more for political news and to stay up with what's going on, not necessarily the local scene, but the national one. Facebook is more friends and fun, and groups, like the car groups I've joined so I can get information on how to do certain things with the cars I have. So I guess I have trained my Facebook algorithm to be more geared toward cars, and my X algorithm to be more political. As far as Instagram goes, I haven't really trained that one much; I don't get on Instagram a whole heck of a lot, but I do get out there, so we can start posting more from the podcast on it. I was going to say, no you don't, because we've been lacking there a little bit. Yeah, we have. But as soon as we get any reels in, I'm trying to get them up there as soon as I can.

So we encourage you to train your algorithm; don't let it train you. Turn off the microphones if you don't want to be heard, although if you're looking for something and you want information, maybe you should keep them on. We've got to take back control of our algorithms. If we can move past the noise, maybe we can come together and find more common ground with each other in real life. There you go, Travis, that's it right there. Common ground. That's what this country needs to get back to. I think we've gotten closer to it than what the news algorithms are feeding us. I do believe we're coming back around, full circle, to being back together again. I think the divide is getting smaller, but if you listen to certain news, or pundits, people just out there talking, it may seem like we're further apart than we've ever been. I honestly don't believe that, especially in the jobs that we do. We're out there talking to people daily, and when you're talking to somebody in person, we have a lot more in common, and people are nicer than you would think.

So hey, pay attention to what you're feeding your algorithm this week. Are you training it, or is it training you? That's a good question. Be on the lookout. Thanks for watching. Peace out.
