Investing in Intelligence: AI Stocks and Options
For AI stock enthusiasts: explore comprehensive insights and analysis on artificial intelligence market trends. Get the latest AI investing updates, expert commentary, and strategic investment approaches tailored to the AI industry. Uncover AI options trades and AI market forecasts designed to strengthen your AI investment portfolio. Join our community of forward-thinking AI investors and confidently navigate the AI revolution. Embrace the future of artificial intelligence investing.
AI Robots for Surgery? Intuitive Surgical (ISRG) as an AI Stock
AI robots for surgery? Intuitive Surgical may eventually make it a reality. We discuss Intuitive Surgical (ISRG) as a potential AI stock, debating the capabilities of robotic surgery and the potential for future advancements in medical artificial intelligence and the computational power of surgical robots. We then discuss the challenges and liabilities of autonomous robotic surgery and ISRG's dominant position in medical AI data collection. We also cover $ISRG's financial performance and evaluate it as an AI stock investment.
Intuitive Surgical (ISRG) AI Stock Takeaways:
-Intuitive Surgical (ISRG) is the dominant provider of robotic surgical systems and currently has no real competitors in the market.
-While robotic surgery has made significant advancements, the current systems do not possess true artificial intelligence.
-The future potential of robotic surgery lies in the integration of data collection and analysis to improve surgical practice.
-While autonomous robotic surgery may not be feasible in the next five years, ISRG is taking the initial steps to build a trove of AI data.
-Intuitive Surgical's financial performance and market dominance make it an attractive investment option.
Intuitive Surgical (ISRG) AI Stock Chapters:
0:00 Overview of Intuitive Surgical ($ISRG)
2:59 Discussion on the Future of Robotic Surgical Systems
5:00 Exploring the Potential of a More Computational Robotic Surgical System
7:44 The Possibility of Auto-Drive Features in Robotic Surgery
9:15 The Evolution of Intuitive Surgical's Robotic Platforms
12:26 The Use of Data and AI in Surgical Practice
14:35 The Potential for Autonomous Robotic Surgery
16:00 The Challenges and Liabilities of AI Robotic Surgery
20:03 The Role of AI Computer Vision in Robotic Surgery
22:19 The Fear Factor and Human Acceptance of Robotic Surgery
23:46 Intuitive Surgical's Market Dominance
28:08 The Current Impact of Macroeconomic Factors on the Stock Market
James (00:07.884)
Welcome to the Investing in Intelligence podcast, where we talk about artificial intelligence companies, stocks, and trading. I'm James, here with my cohost Kai, and today we'll be talking about Intuitive Surgical (ISRG). I want to remind you that the opinions expressed on this podcast are just that: opinions. They should not be taken as specific advice to invest in a particular way.
So we're gonna kick it to you, Kai. Can you give us an overview? What is Intuitive Surgical? What does this company do? Roughly how long have they been around? What are their main products, services?
Kai (00:39.818)
I could talk a long time on this subject. Intuitive Surgical is essentially a company that provides surgical robotics. It's called robotics because there's a computer associated with it, but it's essentially a tool. Intuitive Surgical came out with its first robotic platforms in the early 2000s. I think it started with the S model, then the Si, and now we're on the Xi.
And forgive me if I'm wrong, I think they just came out with a new platform, the da Vinci 5, which I believe is their fifth platform. I've seen every robotics platform from the S to the Si to the Xi to this new da Vinci 5. But essentially, what it is is a four-armed robot.
Robotics in and of itself implies artificial intelligence. However, the only reason this is classified as such is because it has a computer associated with it. It does not have any sort of artificial intelligence; it's a computation platform. What that means in practice is that the surgeon sits at a console, puts his hands in the console, and operates with the robot.
As for laparoscopic surgery: over the last 50 years, surgery has turned onto a pathway of a minimally invasive approach, meaning smaller incisions, less invasive. In the 1980s and 1990s, they came out with what's called laparoscopic surgery, where the surgeon operates through small incisions: a camera on the end of a stick, so to speak, which uses optics, and the surgeon puts two other sticks into the patient through a port,
Kai (02:30.506)
and then the surgeon uses those instruments to operate. It's the same thing here, except in this instance the surgeon is controlling those instruments through a robotic console. That came out in the 2000s, and Intuitive Surgical has grown drastically since that time. But essentially, that's what the company does: it makes the robotic platform and the instruments that go along with that platform.
James (02:59.566)
Very cool, very cool. Sounds a little bit like Tesla: you said they have a Model S and a Model X, so maybe one took inspiration from the other. I want to jump right into the things I'm most interested in, so I'd actually like to play just a moment from their conference call here.
James (05:00.654)
So future programmability is the big key phrase that I pulled out of that. And he's talking about over-the-air upgrades, almost like a Tesla. So it sounds to me like they have ambitions of making this a more computational, and that's actually a word he used, a more computational robotic surgical system. Set aside data and AI for a moment, which everyone on their earnings calls has to mention these days, because we don't want to jump straight to the thought of a robot doing surgery on people without a human there, obviously, just like it's still scary to think of a robot driving a car without a human monitoring it. Not to mention questionably legal. So in this case, he's talking about a more powerful computer in the latest da Vinci robot. Why does that matter? Setting aside data and automation, not even thinking about AI, why does it matter that there's a better computer in this system?
Kai (06:00.682)
Well, I think the key part of what he said is "starting point." So let me give you an example. The S and Si systems were ergonomically much different from the Xi system, which has been out since 2014. The Xi changed the way we were able to do surgery for a couple of reasons, but mainly two: number one, the camera is ten times better, and number two, the ergonomics of the system make certain surgeries possible via a minimally invasive approach, or at least much easier.
That being said, computation power in general is, I don't want to say meaningless, but I think what he's describing is the start of something, not something that's already utilized in the system. I don't want to pretend to offer expertise on how a computer works, but how a computer works in this system means nothing to the surgeon.
The robot in itself doesn't do anything on its own. For example, I remember an episode of The Resident, one of the first episodes, where they had a scene in which the robot was doing something to an apple or a pear, and then all of a sudden it flipped out and started slamming the pear against the table. That is impossible with the da Vinci platform, because it does not have a mind of its own.
James (07:35.47)
So why do we need a better computer? They're improving the processing power of the new one. Why do we even need a better computer in the robot then?
Kai (07:43.178)
Well, I think as a starting point, it opens up things that could be possible in the future. For example, one thing that may be possible down the road is watching how a surgeon performs a certain skill and then offering an auto-drive-like feature for that skill. But the changes they've made so far are so minute, so minimal, that it's definitely just a starting point.
James (07:49.23)
Okay.
Kai (08:09.77)
To go from utilizing a tool to actually using a robot, we're not even close. This is the starting line, not the middle of the race. They haven't changed the robotic platform to do that yet. Even with this newly released system, maybe they're making the computation part of the robot
James (08:19.278)
Okay.
Kai (08:34.698)
able to do that in the future, or they're starting to incorporate those things as a possibility, but ergonomically it's impossible right now. They don't have the ability to do that.
James (08:46.19)
I guess what I'm wondering is, what is the size of the computer in these things? What you're saying now sounds to me like there's not even a big enough computer in it to really do anything on its own. We know that in a Tesla there's a good-sized computer, and they've actually had to upgrade that computer over time in order to do better self-driving. So what I'm really wondering is: is there visibly a computer in it? Is there a GPU? A CPU? Is there...
Kai (09:06.344)
Yeah.
Kai (09:14.794)
That's an excellent question. I almost look at the robot like this: the S model was like a basic Prius with a Sirius radio in it, where you just turn it on. The newer models, for example, now have a screen in the car that gives you a bit of a navigation system. You can log in, you can set your preferences. For example, maybe you like your electrocautery, which uses...
James (09:16.238)
like USB ports.
Kai (09:43.498)
it uses electrical current to cut tissue, maybe you like that on a particular setting, for example, and you can set it to that setting, but...
The interface has gotten better. Let me give you one example: now you can log in. This newest system even has a place where you can charge your cell phone, for convenience, but you can log in as the surgeon and change certain things. One of the things you do with laparoscopic surgery is use insufflation to create space inside a body cavity, and you can control the setting on that via the computer on the robot. But it does not do anything on its own. You can't push a button and have it do anything on its own.
James (10:27.534)
Sure. But are you saying a few generations ago, it wasn't even possible to log in? Like there wasn't even a keyboard? Is that what you're saying? Like a few generations ago?
Kai (10:34.666)
I don't remember exactly. I mean, it was a pretty basic system. Ergonomically, the robot changed a lot with the Xi model he's mentioning, which was around 2014, I believe, don't quote me on that; from the Si to the Xi there was a big ergonomic difference. From the Xi to this new model that's out, there's not much ergonomic difference. There's very little. It's just...
James (10:38.734)
Yeah. OK.
Kai (11:02.986)
And they may take this as a critique, but it's just like upgrading your car to a new, fancier model. It does have some new things, for example, resistance that you can feel with movement. For example, James, if I'm grabbing your arm and pulling you, I can feel some resistance. With the robot, traditionally you can't do that; the robot is doing the actions you're telling it to do, moving as your hands move. But eventually your eyes...
James (11:21.934)
Yeah. Yeah.
Kai (11:32.586)
and your hands start to "feel" it; it's an odd sensation. This new system actually gives you feedback in regards to resistance, things like that. But I will say this: that's why I think the key phrase is "starting point," because with data collection in the future, I do see the possibility for them to integrate surgical video into the development of their robotics. I do see that. And I don't even want to say robotics; maybe into their instruments and/or computation power. That's where I think he's saying "starting point," because with this new system, Intuitive Surgical has created, or is creating, a cloud network and an application that you can put on your phone, and you can download your surgeries to it. So...
James (12:26.158)
Yeah. So right now what we're looking at on the screen is something called My Intuitive. I just pulled this up on their website. It says here that they're giving an overview of actionable data for surgical practice and robotics. So what data, I mean, let's be...
Kai (12:44.074)
That's the data you're talking about. The actionable data they can collect: operative time, instrument usage, videos. One of the things that helps people in training is watching videos of other surgeons operating, or even of themselves. You can improve by, let me use a sports analogy, watching your practices, watching your game film, and then improving on those things. I think that's what Intuitive Surgical is aiming to do here, and I think it's great. They have really changed the game, especially in minimally invasive surgery. Surgery hadn't changed for a long time until the shift to minimally invasive surgery in the 1980s and 1990s, and now da Vinci and Intuitive Surgical have changed the game again.
James (13:37.262)
So I think what's scary about this topic is the idea of a robot not only taking actions, but taking actions on the human body. And that's a terrifying thought. But we already know that companies like GE HealthCare and Johnson & Johnson are gathering visual data, videos and images from medical devices, to build their own AI models for medicine. So if they're doing it, Intuitive Surgical probably should be also. Now, I know that GE HealthCare and others are focusing more on ultrasound and CT and MRI, those kinds of imagery. What about surgical videos? And just cut to the chase for us: if it's possible for an AI model to drive a car, like the new Tesla Full Self-Driving, do you not imagine it eventually being possible for an Intuitive Surgical robot to perform some surgical procedures on a real human being?
Kai (14:40.874)
Not in the next five years; impossible. I will say that, and I'm a hundred percent confident in it. The chances of an actual robot, and now you could really call it robotics, because it wouldn't be the surgeon, it would be the robot, performing any part of the procedure in the next five years: impossible. I'll go out on a limb and say that. Number one, you mentioned the...
James (15:02.094)
Why is it impossible?
Kai (15:05.192)
Computation power in regards to measurements is much different here. Measuring a road or a map using Garmin-style GPS systems is one thing; we don't have a map of the inside of the body, number one. Number two, it's akin to flying. The airline industry is so good at minimizing error and plane crashes that people are willing to fly; otherwise nobody would. But there's a certain amount of error that a surgeon inherently has. For example, the rate of a severe injury during a gallbladder surgery, one of the most common surgeries we do in the United States, is about one in a thousand; I think it's actually more than that, but it happens, no matter what. So there is a certain error associated with surgery, and theoretically that will never be zero. And I don't know if we're able or willing to accept that error in regards to robotics yet; that's what I'm getting at.
James (16:00.718)
Yeah, my concern is actually the litigious nature of medicine. We know there's also a highly litigious nature to driving, and that auto accidents lead to a lot of lawsuits, but the two may not be comparable. It may be that there's actually more potential for litigation in the medical world, and ultimately the physician performing that procedure or surgery will probably own the responsibility.
So I do think there are a lot of comparisons between driving a car and these medical robots, and the potential for AI in both. But my concern is that no physician wants to take the liability for an autonomous system doing anything to a patient if their name is ultimately responsible for that patient's care. What do you think about that?
Kai (16:52.49)
Yeah, well, in this regard, that's why a lot of surgeons, a lot of doctors, will talk about art, right? It's not just a practice, it's an art. And so the question is: when will an Intuitive Surgical system be able to perform art? Because it's not just a standard approach to doing everything.
There's a lot that goes into it. And I'm not saying a computer can't be smarter than a human, don't get me wrong. The main argument I'm making is that the data is not there. The data for an artificially intelligent robot to be able to do this does not exist yet. And that's the real starting point; I don't think he's alluding to that directly, but for developing that computational power, or an actual robotic system to perform even part of the surgery,
the data's not there. They have to get a lot of videos to be able to do that. So let me answer your question. If surgeons are watching videos today, they're doing it one of three ways: YouTube, which is probably one of the largest platforms out there for watching surgeries; Facebook; or their own videos, ones they've recorded themselves and kept.
In fact, my grandfather used to have tapes of his own surgeries. He would record them, or get videos from other surgeons, and watch them on a VCR. So until a company is able to take in all of that video data and translate it into what would then truly be called a robotic platform, we're not there. And they'd have to start small. I do envision a scenario in the future where you say, "Hey, I've performed this part of the procedure the same way over the course of a hundred procedures; autopilot this part for me." I could envision that, and maybe they're developing it.
James (18:49.55)
Yeah. And potentially insights also; one thing they mention on the website is insights. The thing is, the movements within the human body are so much more minute than the movements a car makes on a road. The Tesla is looking at lines on the road, and it's also using GPS and other things like that to determine its location. But essentially the Tesla uses what we call computer vision, and it's a huge amount of data, this data you talk about, imagery and videos from other cars driving, built into a model.
Kai (19:21.93)
Yeah. And James, to use the road analogy, inside the body the lines on the road are always spaced differently, they run in all different directions, there's no map for them, and sometimes two lines are crossing, right?
James (19:30.254)
Exactly. Yeah.
But there are limits to the variations that can exist in the human body, and maybe those can be roughly calculated with things like BMI and body weight. So eventually it may be possible to put that data into the computer, into the da Vinci robot, and the robot can make some calculations based on it. And I just think we don't understand the true potential of computer vision. It may be possible for the robot eventually, with a proper AI model, to make an incision, look around, immediately make measurements, and kind of know. What it comes down to is, I believe that if you can teach a human surgeon to do these things, you can teach an AI model to do them. We're just not there yet with the actual physical robotics part of it. You could already take all the videos from all the surgeries that were done, let's say over the last 10 years,
Kai (20:23.594)
You.
James (20:31.086)
And you can already develop an amazing model that would make a lot of conclusions, insights, things like that. But to actually connect it to physical action is tougher, I think.
Kai (20:36.106)
Let me talk for just a second about what you're alluding to, the ergonomics. This system is actually quite, I don't want to call it basic, because I think it has revolutionized surgery, but one of the differences from laparoscopic surgery, not the only one, is that the arms move like your hand. They're not your hand, they don't have fingers, but they can articulate 360 degrees. Standard laparoscopic instruments don't articulate like that, so you have to move your whole arms around to do the same thing. So to be able to tie something that's even better ergonomically, you'd have to design that with engineers and then tie it to the computation power
James (21:03.758)
Okay.
James (21:12.142)
Okay, yeah.
Kai (21:27.69)
of video analysis to then design a whole new platform, probably. Or I could envision something in the future where, through computation power, the robot is able to move those instruments the same way the surgeon moved them over a particular portion of the procedure. Sort of like, "I want to drive down the highway, but only for a mile or two at 50 miles an hour, go."
We're still so far off in the future on this, and I'm stressing this point from a patient perspective, because I think it would be a mistake from a company standpoint to say that this computer has artificial intelligence, because it does not. There's a fear factor that goes into that, the fear that the robot is just going to spaz out and cut me in the wrong place. Humans aren't ready to put their lives in the hands of a robot to perform a surgery yet. That's part of the issue here, and at some point the technology will have to overcome that barrier, as well as the human fear part. But just from...
James (22:44.014)
Sure. And I think, I think the, go ahead.
Kai (22:48.042)
I want to say one thing too about Intuitive Surgical, since we're going to go over companies: there is no competitor. You may mention some names, but there is no real competition. They own this market. There is no other robotics platform that is anywhere close, that I have ever heard of, and I'm saying that with full confidence. So...
James (22:57.166)
That was my next question. That was my exact next question. Yeah.
James (23:10.318)
Look, I see a forward P/E of 64 and a trailing P/E of 79. And you know that when this stock was starting to drop last year around October, when really everything was dropping, I was questioning the valuation here. I was questioning the multiple and saying, okay, the market is going to look at Intuitive Surgical and ask why it deserves a P/E around 70 when a lot of other medical companies, including medical device manufacturers, have a much more reasonable P/E.
So are you saying this company receives that premium just because it's a monopoly?
Kai (23:45.45)
I'm saying it has no competitors in this particular industry. When you're looking at a company's fundamentals, you want to look at competition, because competition takes away from that company's market share. Intuitive Surgical has no competition. So...
James (23:49.294)
OK. That's a nicer way to say what I said.
James (24:05.422)
Yeah, I like their margins too: 25% profit margin, 23% operating margin. It looks like they have great...
Kai (24:11.658)
Their net profit margin is 33%. So let me give you an example. Back in October, the company dropped down to around 250, 255. Sometimes you buy the news and sell the fact, right? But it did that on the news of these GLP-1 drugs, and somebody somewhere said we're going to see fewer bariatric procedures being done as a result. Even if that were the case, Intuitive Surgical makes its money in two main ways: the sale of the robot, but more importantly, the number of procedures being performed.
What the company has shown is steady procedure growth in the United States each year, and that continues to grow. That's where they make their money, because the instruments can't just be reused for every procedure; they continually have to be repurchased. And so that's why you and I talked back in October and I said, what I look for at Intuitive Surgical is procedure growth, because as procedures continue to grow, revenue will continue to grow. So to your question of whether a P/E ratio of 80 is appropriate: I understand where you're coming from, and no, I don't think it deserves a P/E of 80, let me just say that. But when I'm calculating a price target, with Intuitive Surgical at around 399, I come out somewhere around 450 or 500,
James (25:38.414)
Sure.
Kai (25:58.986)
because its price-to-book ratio is about 10. So it's a fairly, it's a fair valuation; its market value is a little high, but its EPS is potentially going to be about eight this year, and I give it a P/E ratio of about 50. So for everyone out there, I'm doing the math: we're multiplying EPS for the year times the P/E ratio to get our price target. And so...
James (26:27.022)
So what I see is, if 8.34, which is the highest estimate of next year's earnings, actually came true and we gave it a P/E of 50, I calculate 417 for the price target, which is about where it is right now. So I think for us to go much higher from here, we would need...
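As an aside for anyone following the arithmetic, the back-of-the-envelope method the hosts are using is simply price target ≈ projected EPS × assumed P/E multiple. Here is a minimal Python sketch using the figures quoted in the episode (an 8.34 EPS estimate and the 50 and 80 multiples being debated) rather than live market data; the price_target helper is purely illustrative:

```python
# Back-of-the-envelope price target: projected EPS times an assumed P/E multiple.
# The inputs below are the figures quoted in the episode, not live data.

def price_target(eps: float, pe_multiple: float) -> float:
    """Estimate a share price target as projected EPS times an assumed P/E."""
    return eps * pe_multiple

eps_estimate = 8.34          # high-end next-year EPS estimate mentioned by James
for pe in (50, 80):          # the two multiples the hosts debate
    print(f"EPS {eps_estimate} x P/E {pe} = ${price_target(eps_estimate, pe):.0f}")

# Output:
# EPS 8.34 x P/E 50 = $417
# EPS 8.34 x P/E 80 = $667
```

The same spread of answers (roughly 417 versus 667) is why the hosts land in such different places: the EPS estimate barely moves, so the entire disagreement is over which multiple the market will keep paying.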
Kai (26:49.066)
Well, go ahead and do that times a P/E ratio of 80, which, if you look at its macro trend... well, why does SMCI get a price-to-book ratio of 18? I mean, if you compare... But also...
James (26:54.862)
Well, that's exactly my question. Why does it retain a P/E of 80?
James (27:04.488)
SMCI's forward P/E is way lower, way lower than ISRG's forward P/E. And ISRG doesn't have the growth that SMCI has.
Kai (27:11.434)
It doesn't, but it has steady growth over the last 10 years and it doesn't have any competitors. And you could say SMCI is more overvalued by far. You're talking about overvaluation, and I'm making the point that SMCI is way more overvalued than Intuitive Surgical is. That's what I'm arguing. Would you agree or disagree?
James (27:38.958)
I fundamentally disagree. The forward P/E I just looked up for SMCI is 32.47, whereas for ISRG we just mentioned it's significantly higher, I think close to 70. So I just fundamentally disagree with that. But what you are right on, and what I agree with 100%, is that SMCI has gone from this to this and could potentially go back down, whereas ISRG is just staying steady. We know that ISRG has been the dominant force, the only dominant force, in surgical robotics and will remain that way. So there's just no risk in it. There's no competition. You're pretty much guaranteed. I mean, with a company that has that kind of market dominance, they could raise prices as they wish and get away with it, because no one's going to turn and go to the competition, because there is no competition.
Kai (28:25.226)
So the new robot platform, well, absolutely. The new platform is about $2.5 million; I think the Xi system was around $1.8 or $2 million, so let me just round to $2 million. Each time more procedures are performed, that's more revenue, and they're going to go sell this new platform around the country. And let me just say, there are still a lot of hospitals out there that don't have one of these, coupled with the fact that...
James (28:36.558)
Okay, sounds good.
Kai (28:51.338)
New surgeons are going to want to use this; they've been trained on it. It isn't the same thing as me giving you an iPhone and saying, learn how to use it. These surgeons have been using this system for the last five or ten years of their training, so they're going to require it in order to perform surgeries. It's brilliant what ISRG has done. I think a realistic price target for ISRG, if we don't go into a bear market, is around...
James (29:03.406)
Yeah. Yeah.
Kai (29:20.586)
I'm just going to ballpark it, but around 500. And I don't think it's unrealistic for it to keep its P/E ratio at 80 with an EPS of eight for the year. That P/E ratio, what the market's trading it at, James, means everything for ISRG, but it still has a lot of room to grow. And if it dropped at all, buy up as much as you can, buy up as much as you can, because unlike ISRG...
James (29:46.638)
So.
Kai (29:49.798)
SMCI is a huge unknown in five years. ISRG is a sure thing. It's 100% a sure thing.
James (29:55.79)
Of course, that's not investment advice to buy as much as you can. But the two things I like most about ISRG: first, eventually there will be more autonomy in medical robotics, and the winner will obviously be Intuitive Surgical. So the fact that they're already starting to gather data, and they've got this new platform encouraging people to upload videos, that's the number one thing I really like about it.
Kai (29:59.69)
Well, of course not.
James (30:25.454)
The number two thing I really like about it is their reliable, consistent performance over time and their market domination. So definitely an amazing company here, and one that could be a great thing to have in someone's portfolio if it's right for them, after they've done their own analysis. Now, I did want to mention a couple of things, because you said something a second ago about a bear market.
I'm still not seeing that, and I'm pretty pleased with what we saw in this most recent PCE report. It was kind of a mixed report; the people who broke it down into sub-segments saw a couple of things they liked and a couple they didn't, but overall it was in line. So it looks to me like we're going to have maybe a week or two of floating along in a bull market, and then we get CPI and PPI in a couple of weeks and the market will have to reevaluate where we stand. But with this PCE report, I don't see interest rates shooting up on Monday, I don't see the dollar moving drastically on Monday, and I see stocks probably sticking with the trend for a week or two. What are your thoughts on where the economy as a whole is and where the stock market as a whole is?
Kai (31:32.554)
So my observation from last week is that the market traded sideways for about three days because everybody was waiting to see what the PCE was on Friday, which was a market holiday. And now that's come in in line. I'm really interested to see this next week. Like I said on our first podcast, I think we're up, up, and away from here; I think we're still in a bull market. That being said, I think there's going to be a good reaction to it this week.
James (31:38.894)
Yeah.
Kai (32:01.93)
I think earnings is what people are going to look to next, right? Would you agree with that?
James (32:07.246)
Yeah, yeah, earnings coming up soon. Starting in mid-April we start to get earnings again, so the market might pause for a week or two until those earnings come out and then maybe move on them. But look, you look at...
Kai (32:17.322)
It just blows my mind. I mean, they said this was the best Q1 in a long time. I kind of envision that this next quarter's earnings might shake things up again, sort of like back in October. If they catch a little sniffle, we may see people sell off at least back to the 50-day. I'm calling for a test of the 50-day EMA at some point in the next three months. What are you calling for?
James (32:48.206)
Earnings can be sliced and diced in so many ways. Since we're so much of a tech economy now, do we just look at the earnings for the big tech companies? I feel like I hear people say earnings were worse than expected and better than expected for the very same group of earnings. So it can be sliced and diced in different ways, one of those things where two different people can look at it and draw two different conclusions.
So I guess what I would say is that things hinge more on the macro economy right now: the inflation numbers, how many times we're going to get a cut from the Fed and how big those cuts will be, and unemployment, of course. I feel like those are the biggest things moving forward for this year. But I can see...
Kai (33:29.29)
Three cuts or two, James. Three cuts or two.
James (33:34.062)
I don't know, I don't know. But I can see that crypto, Bitcoin, is hovering around 70,000 and just does not want to go down any further. So you see these high-risk assets holding up, and gold, gold is ready for the cuts.
Kai (33:43.53)
The greed index is pretty high; it's getting up there, James. So let me say this real quick, the last thing I'll say about ISRG, going back to that: this is not an artificially intelligent company. This is not an AI company. One of the reasons we're covering it is that some people could view it as such, and it's not that, so I want to be clear about it. We'll talk about other companies on this podcast as well, but...
James (33:49.664)
Yeah.
James (33:58.83)
Sure, sure, fair.
Kai (34:10.922)
In regards to artificial intelligence, that is not ISRG right now; maybe at some point in the future. And yeah, James, I'm thinking three rate cuts, going back to that, because I think they're going to push to get Biden reelected. So I think they'll do the three rate cuts, they'll say the PCE is in line, and honestly one of the bright spots for Biden is that the market is doing well and the economy is doing well right now. So...
James (34:41.646)
Yeah, well, time will tell, but we will be there to cover it. So thanks, everyone, for listening to our podcast. Please give us a rating or review, or on YouTube, a like or comment. We'll be back in the next episode to cover more AI stocks and broader market information. You can find all our holdings and trades at investingintel.ai. Thank you so much, Kai.
Kai (34:59.146)
Thanks, James. Talk to you soon.
James (35:02.72)
See you next time.