Apple ‘Illusion of Thinking’ Debate, DuckLake Lakehouse & Magistral AI

Murilo:

Hello. Actually, I don't have something planned, but maybe we should start again. A few moments later. Hello. Welcome to the monkey patching podcast where we go bananas over all things data and AI.

Bart:

Hey, Bart. Hey. Hey.

Murilo:

How are you? Long time.

Bart:

Long time no see. Where have you been?

Murilo:

I've been... now, where to start? Actually, after we last recorded, we went to Sicily together.

Bart:

True. We

Murilo:

went to a wedding. We danced. Yeah.

Bart:

Oh, boy. A very good we did a very good performance. Right?

Murilo:

Yeah. The best. Yeah. I think they're still talking about it.

Bart:

We did a, yeah, a classic Backstreet Boys.

Murilo:

Yeah. Not only singing, but also interpretive dancing as well. Yeah. Shout out to our friends Ardalan and Jonas, who composed our troupe.

Murilo:

I'm gonna give the name, but yeah. And after that, the weekend after, actually, I went to Japan for my honeymoon: Tokyo, Kyoto, Okinawa. It was pretty cool.

Bart:

Pretty cool. Sounds cool. Never been there. Never been. What was, of those three weeks, the coolest thing?

Bart:

You need to choose one. I think the okay. If I

Murilo:

had to choose one... but, like, just to tell people more: there's this park in Nara. It's a city near Kyoto where they have deer just walking around. So they're just deer. You can go.

Murilo:

You can pet them and stuff. Sometimes they walk away, like, whatever. And if you give them food and you bow to them, the deer bow back to you.

Bart:

What? Yeah. Like, for real. Like, you go like this,

Murilo:

and they're like, oh, yeah. Okay.

Bart:

You said? Yeah. Really. I can show you video afterwards. Oh, that's crazy.

Bart:

Yeah. I think that's the They really were conditioned to I think so. Wow.

Murilo:

They really go like this and, like, okay. Yeah. Give me the stuff. So that was pretty cool.

Bart:

That's funny. But I think it's the typical Japan meme. Yeah. It's like, even the deer are polite.

Murilo:

You know? It's like but no. Yeah. But it was very different. Like, Tokyo, I feel like it's, like, huge city, a lot of people, but it's efficient as efficient as it can be.

Murilo:

Yeah. A lot of metro, a lot of stuff. Very interesting to see, like, a city in that dimension because I'm from Sao Paulo, which is also a big city, but things don't work well. Yeah. Yeah.

Murilo:

Yeah.

Bart:

Yeah.

Murilo:

Then we went to Kyoto, which is more cultural. There's, a lot of temples. There's this. There's that. More nature.

Murilo:

And then we went to Okinawa, where, on my mother's side, my family comes from. So I learned a lot about the history. I was even able to track some names and stuff. So that was also very interesting. I think the last part of the trip, the Okinawa one, was the most meaningful for me.

Murilo:

Yeah. But if I had to just kinda give, like, a fun "did you know this happens?" thing, I think it would be the deer.

Bart:

Nice. It's

Murilo:

cool. And then I came back, and then I went to a training in Chamonix in France, on the border with Italy and really close to Switzerland as well.

Bart:

And what did you do?

Murilo:

So it was a leadership training program where we kinda follow, like, mountain rescuers. They talk a bit about their experience, they simulate some things, and then we reflect a bit on how these leadership traits can be transferred to the workplace.

Bart:

Very cool. Very cool. It was very, very cool.

Murilo:

Very intense as well. Like, it was also very stressful. Like, everything's urgent. You know?

Bart:

It was really in the mountains. Yeah.

Murilo:

It was really in the mountains.

Bart:

In alpine terrain?

Murilo:

Yes. Yes. It was really cool. They still have snow. Like, I came back two days ago, and people are still skiing there. Yeah.

Murilo:

Cool. I was like, this is crazy. Yeah? This is crazy. What about you?

Bart:

I've been well, a bit of everything and a bit of nothing. That's the best way to summarize it. Okay. Well, for the people that don't know, I quit my job two-ish months ago. Yes.

Bart:

Right? I stopped at dataroots, and now I'm looking a bit toward the future, thinking about what I want to do. And in that context, I already have a lot of lunches and dinners and events that I go to. I keep myself occupied with that. Yeah. And a bit of chilling.

Bart:

Yeah. Yeah. I went cycling in the morning at 09:00. You're like, never. I've never done this Yeah.

Murilo:

During the week. Yeah. I was like, wow. Didn't know. Yeah.

Murilo:

Everything's empty. It's like Yeah. Send me send me a tempo, I guess. Yeah. Enjoying it?

Murilo:

Certainly. Are you enjoying this new pace?

Bart:

What I very quickly realized is that I need a goal in life.

Murilo:

You need something. Yeah. I need a goal in life. Yeah. Yeah.

Murilo:

I can I can imagine? I'm

Bart:

at the moment finding, let's say, that professional goal in life.

Murilo:

Yeah. I see. I see. I see. Cool.

Murilo:

And here we are, episode one, by the way. This is the second recorded episode, but I'm calling this episode one.

Bart:

No. No. It's gonna make it difficult. You think so? You because you said to

Murilo:

Before it was episode zero. Yeah. But it doesn't work. Because, like,

Bart:

what Yeah. Yeah. Like, for the people that followed the previous one, we agreed that we're gonna be zero-indexed, that we're gonna start at zero. Like good developers. But what?

Bart:

But the publishing platform that we use

Murilo:

doesn't allow for that.

Bart:

Doesn't allow for it. Really? So we're episode two today. Okay.

Murilo:

But it fits so nicely, because this is the real one where we bring content. Right? We talk about these things.

Bart:

Just accept them. Accept them. Some things in life you need to accept.

Murilo:

I know. I mean, it's yeah. I'll try. I'll do my best. But so maybe for the people like, yeah, if you haven't heard us before, the idea is that, you know, we usually stay up to date with what's happening in the tech world, data and AI and all these things and things that we would normally already talk about, like, share with each other.

Murilo:

And we write them down, and then we have some well, the idea here is to have a a quick overview, and then we can discuss a bit what you think about it. Right? So Yep. You wanna double click there?

Bart:

So what we try to do for this podcast is: we have a chat channel between Murilo and me. And whenever we read interesting stuff during the week, we just post a link in there so the other person reads it as well. And before the episode, we select eight articles, and we go over them. We do this a bit more formally than the previous podcast that we had, the Data Topics podcast, in the sense that we're gonna read a small piece of each article with a little more formal summary, and then we're gonna have more of a freestyle discussion on it.

Murilo:

Yes. So would you like to to start?

Bart:

Let's do this. We're starting with DuckLake. So DuckDB's new DuckLake format proposes shifting all lakehouse metadata into a regular SQL database so open-format data lakes can gain true transactional speed and simplicity. DuckLake reimagines what a lakehouse format should look like, the authors write, arguing it eliminates the maze of JSON files and external catalog services.

Bart:

If adopted, DuckLake could let organizations treat blob storage like a fast, ACID-compliant warehouse without vendor lock-in.

Murilo:

So break it down for me.

Bart:

Yeah. There's a lot of technical mumbo jumbo. Right?

Murilo:

Yes.

Bart:

The... DuckDB

Murilo:

Yes. You know. Yes. But maybe for the people that don't know. One liner.

Bart:

A one-liner. It's a, I would say, single-node analytics engine. So it can query your data in a typically non-distributed way. It's not built for, like, the typical analytics you do in a distributed way, like Spark. A lot of people know Spark.

Bart:

DuckDB takes a bit of a different approach. It works on a single node, and it scales basically vertically. Like, if you have more memory, you can do more. You can do stuff quicker. So DuckDB is just that.

Bart:

Like, it's an analytics engine, but it can read from a lot of different sources, and it can write to a lot of different sources. And it is also highly efficient. So you would say it's not distributed, so it's not really, let's say, a relevant player here, but it is very much a relevant player. Like, for a lot of companies' analytics pipelines, I think 80% would be a fit for DuckDB, and that's probably still understating it.

Bart:

Yeah. But now DuckDB has a new output format, which they call DuckLake. And what they basically do is that, like a lot of the, let's say, modern data lakes, they write out in a certain format; in this case, it's Parquet files.

Murilo:

Okay.

Bart:

Alright. Which is very well known. Yep. The different thing that they do is what sits next to the Parquet files. In order to understand where in your Parquet files which data is Mhmm. You typically need a lot of metadata.

Bart:

And you either write that metadata to disk or you have a catalog where you keep this metadata, and you need this metadata to basically query the data that you have or to insert new data. And what DuckDB did, and it sounds very simple how I say this now, but what they basically did is they created a SQL store, a more or less traditional database, to store this metadata

Murilo:

Ah, okay.

Bart:

Which makes it very fast to read and write this metadata, which is typically a bottleneck. So your whole data lake becomes much more performant.
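
The core idea Bart describes can be sketched with Python's built-in sqlite3 standing in for the catalog database. Note this is a toy illustration: the table and column names are invented, not DuckLake's actual schema, and the file paths are made up.

```python
import sqlite3

# Toy metadata catalog in a regular SQL database. The Parquet files
# themselves would live in blob storage; only their bookkeeping is here.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE snapshot (
        snapshot_id  INTEGER PRIMARY KEY,
        committed_at TEXT DEFAULT CURRENT_TIMESTAMP
    );
    CREATE TABLE data_file (
        file_id     INTEGER PRIMARY KEY,
        snapshot_id INTEGER REFERENCES snapshot(snapshot_id),
        path        TEXT,       -- Parquet file in blob storage
        row_count   INTEGER
    );
""")

def commit_files(paths_and_counts):
    """One transaction = one new snapshot: ACID comes from the SQL store."""
    with db:  # everything inside this block commits atomically
        cur = db.execute("INSERT INTO snapshot DEFAULT VALUES")
        snap = cur.lastrowid
        db.executemany(
            "INSERT INTO data_file (snapshot_id, path, row_count) VALUES (?, ?, ?)",
            [(snap, p, n) for p, n in paths_and_counts],
        )
    return snap

snap = commit_files([("s3://bucket/part-0.parquet", 1_000_000),
                     ("s3://bucket/part-1.parquet", 950_000)])
files = db.execute(
    "SELECT path FROM data_file WHERE snapshot_id = ? ORDER BY file_id", (snap,)
).fetchall()
```

A query planner only needs one fast SQL round-trip to learn which Parquet files make up a snapshot, instead of listing and parsing JSON manifests in object storage.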

Murilo:

Okay. But then, like, is it still a data lake, or do you actually have an engine for the metadata?

Bart:

You have a traditional database for the

Murilo:

Just for the metadata.

Bart:

For the metadata.

Murilo:

Oh, okay. Yeah. So it's a bit of a mix. Like, you still have, like, just files?

Bart:

It's a bit of a mix.

Murilo:

Yeah. Yeah. Interesting.

Bart:

But that basically means that you can treat your data lake basically as a modern data warehouse. Because it becomes very performant to query. And

Murilo:

Yeah. Yeah. Yeah. So if you're using something like DuckDB DuckDB normally is for, like, say, OLAP. Right?

Murilo:

So more analytics, so it's more reading. Yeah. Right? For writing stuff, it's probably not the best. Right?

Murilo:

Because you so this is the way I'm thinking. Actually, I spent some time thinking about it. Like, if you have a simple application, a POC, you could just have a Parquet file and pretend that's your database. Or, like, even if you have, like, three users or whatever, maybe that's not the best use case because you're also rewriting data to it.

Bart:

Well, I would say when you use it for, let's say, what a typical data lake or lakehouse is used for, which is large amounts of data Yeah. Then it is efficient in writing. So it is very efficient in writing thousands, millions, whatever amount of rows, and the metadata for that.

Murilo:

I see. I see.

Bart:

But it is not efficient at all if you're just gonna add one row or Yeah. Update one row.

Murilo:

Yeah. I see.

Bart:

That's Okay. Interesting. But I would say that's not necessarily specific to DuckLake. Right?

Murilo:

Yeah. Yeah. Yeah. Yeah. Yeah.

Bart:

Yeah. It's the same for data lakes or lakehouses in general.

Murilo:

Cool. Do you think this was a problem? Because, I mean, I hear what you're saying. So it's more performant. You can write the metadata faster. But I'm a bit, quote, unquote, surprised that this was an issue.

Murilo:

Not that it was an issue, but, you know, an improvement point. Like, no one was complaining about it, but when this was released, I heard a lot of noise about it.

Bart:

Yeah. I think, yeah, when you explain it, it sounds very simple. Right? Like, basically just using an actual SQL database as a metadata catalog.

Murilo:

Yeah.

Bart:

I think it got a lot of notice because it's very simple, but also because the DuckDB user base is still very big. Right? Yeah. Yeah. Especially in engineering circles.

Murilo:

For sure. So

Bart:

it's interesting to see these kinds of, let's say, simple inventions that do have a big impact on efficiency. Right? Without us having to reinvent the world to do this.

Murilo:

In a lot of ways, I feel like those are actually more clever than the very complicated things. You know? Yeah. Like, I feel like if you see a problem and you see a simple solution for it, to me, I give you more credit for it. You know?

Bart:

Yeah. Yeah.

Murilo:

Cool. What else do we have? What else do we have? Let's see.

Bart:

We have something from Mistral.

Murilo:

Yes.

Bart:

Over to

Murilo:

you. So Mistral Aya has unveiled Magistral, its reasoning centric language model, releasing a 24,000,000,000 parameter open vision alongside a more powerful enterprise tier. As the company puts it, and I quote, Magistro is designed to think things through in ways familiar to us, offering transparent multilingual chain of thoughts and 10 times faster replies in LeChat. By open sourcing the small model and touting competitive benchmarks, Mistral positions itself as a nimble challenger to the big LLM providers. So this is the actually, I'm a bit surprised actually when I saw this because this is from June 10, so this is kind of recent.

Murilo:

Right?

Bart:

This is kind of recent. Yeah. Yeah.

Murilo:

And I'm a bit surprised that this is the reasoning model that they have. Yeah. So maybe a recap. Like, reasoning model, it just means that you ask something to the LLM, and then the LLM doesn't just give you the answer. It actually will spit some things out, and then it will use the things that it spit out with your question to actually answer your question.

Bart:

Yeah. Yeah. Yeah. I think that's a simple way to put it.

Murilo:

Yeah. Because I I yeah. Because I know the people

Bart:

And, like, spitting something out, like, it's it's the reasoning about your question.

Murilo:

I'm purposely trying to avoid terminology. Like, because I know in ChatGPT there was a thinking thing, and I remember talking to even mathematicians that were very educated and could understand the mathematics behind it.

Bart:

But it basically more or less tries to broaden the context a bit. Like, if I would ask a model, who is Murilo Cunha? Yeah. No. Who is Murilo Cunha, the machine learning engineer?

Bart:

Yeah. If I would ask that. And the reasoning part starts. The reasoning tokens come out. Like, what you will typically see is something like: this person is probably not asking about Murilo Cunha, the famous soccer player.

Bart:

Yeah. So we need to look at something else. Like, you get a bit, like, questions on your question to Yeah. Better guide what the answer will be.

Murilo:

And, actually, I think well, maybe it's a bit of a side note, so I don't wanna go too much on that tangent. But I remember, like, early well, maybe two years ago, maybe a bit more, that one way to test LLM outputs was to use, like, self-reflection.

Bart:

Mhmm.

Murilo:

So it's like the LLM will give you something, and then you ask, like, is this hallucinated? And they could actually empirically show that a lot of times it could actually sense it's bullshit.

Bart:

Yeah. Yeah. Yeah.

Murilo:

And I think reasoning models, they kind of do that. That's why they a lot of times the output is better because they it's almost like they already do this self reflection Yeah. By itself. Right? But, anyways, so this is the model from from Mistral.

Murilo:

So this is the French competitor. Have you tried this, Bart?

Bart:

I haven't tried it.

Murilo:

You have not tried this, but you're disappointed.

Bart:

Well, I'm disappointed because, like, if you look at the benchmarks

Murilo:

Yes.

Bart:

It basically, on all the benchmarks, scores worse than DeepSeek R1. Yeah. And DeepSeek R1 today is, I want to say, four-ish months old. Yep. Around something around that.

Bart:

Right? Four, five ish months.

Murilo:

Which is, like, 70 years old in LLM land.

Bart:

Oh, no. 20, I would say. Yeah. And then, like because I think the only worthy competitor we have in Europe when it comes to building LLMs is Mistral. In Europe, yes.

Bart:

And then it's a pain to see that it is so hard to release something in Europe that is on par with what today is considered state of the art.

Murilo:

So you think, like, that the data protection Europe makes is is

Bart:

No. I'm not even saying that it has anything to do specifically with regulation, but I just mean, like, we only have one serious player in Europe, and they are not able to Yeah. Build something that is on par with what is today considered

Murilo:

Yeah.

Bart:

The best.

Murilo:

Yeah. That's true. That's true. Even here, so

Bart:

makes me wonder about, like, what is our competitiveness on this going forward.

Murilo:

Yeah. That is true. It's true. It's yeah. It feels like we're behind.

Murilo:

Right?

Bart:

And you could say, like, about the alternatives: okay, it's good that we at least have one. Yeah. Right? It feels a bit like something built in Europe is maybe safer to trust because it probably does adhere to all European regulation, and it's easier for governments, for financial institutions, like, to use these types of services.

Murilo:

Yep.

Bart:

But at the same time, yeah, what does it mean for the future? Because you can only be, like, so much behind if you want to be relevant in this game.

Murilo:

Yeah. It's true. I also well, the other hope you can have is that these are just benchmarks, and maybe on the vibe checks, you know, it's close.

Murilo:

It's also hard. Like, yeah, we're looking at numbers here, but sometimes the numbers don't reflect the actual user feeling. Right?

Bart:

Yeah. That is true. That is true. Yeah. That's a fair point.

Murilo:

Yeah. And I think on the benchmark here, they actually show all three. Like, their medium model seems like it's or, actually, am I reading this wrong? Because in the benchmarks here that I'm showing on the screen, they just have 83%, for example, for AIME '25. 83%, 72%, and 64%, but they are different model sizes, I'm guessing.

Bart:

Yeah. It's a bit hard to read. They are different model sizes. Yeah.

Murilo:

Yeah. So they have Mistral Medium 3, and Magistral Medium, which is higher, I guess. But the other one matches at 64. Yeah. I'm not sure.

Murilo:

So yeah. But I guess what they're trying to say here is, like, yeah, we're doing worse on some things. But for others, if you take the big model, we're doing better. But it doesn't feel very impressive. Right?

Bart:

It doesn't feel impressive at

Murilo:

all. Yeah. I was like, never gonna use this shit.

Bart:

Yeah.

Murilo:

Yeah. Pity. Yeah. But, I mean, again, still waiting. Like, I hope I I am hoping that one day someone will be like, I am using Mistral, and it's really great for this or that.

Murilo:

And, like, yeah, let's yeah. Let's see.

Bart:

I hope so. I hope so. I think the very good thing about Mistral is that it does use an actual open source license on these models. Apache 2, I think.

Murilo:

Oh, really?

Bart:

Which is, of course, much closer to an open source license than the one that Llama is using.

Murilo:

Yeah. True. True. What else we have? We have more AI later, more LLMs later.

Murilo:

But what do we have now, Bart? The hidden time bomb in the tax code that's fueling mass tech layoffs.

Bart:

Yeah. It's an interesting one. It's also a bit of a technical one. So I'll, I'll introduce you to the article. Yes.

Bart:

So a little-noticed tweak to the US tax code, Section 174, that took effect in 2022 made R&D costs dramatically more expensive, quietly spurring hundreds of thousands of tech layoffs. One startled executive admitted, I work on these tax write-offs and still haven't heard about this, underscoring how the change blindsided companies large and small. With repeal efforts now winding through Congress, the episode shows how obscure fiscal fine print can ripple through innovation hubs and local economies alike.

Murilo:

Alright. Break it down.

Bart:

So there is, apparently, a bill that was designed in, if I remember correctly, 2017.

Murilo:

Okay.

Bart:

And that came into effect in 2022. And since then, we've seen a lot of mass layoffs, a lot of which have been attributed to post-COVID reductions after a hiring spree. Now there are voices saying this new regulation is actually and it's not really clear-cut, but there are a lot of people who think this regulation also has an impact on why these tech giants are potentially firing in the US and hiring somewhere else. So what it means is, and I'm not into this tax regulation stuff at all, but I'm gonna try to explain. What you could do before this is, if you worked on R&D-type projects, which a lot of these tech giants do, like, if they develop something new or a new feature, it's very easy to define something as R&D, of course

Murilo:

Yeah.

Bart:

Is that you could basically get a reduction on the amount of tax you pay on your profits. Okay. So how is it calculated?

Murilo:

Yes.

Bart:

Is that normally you have your revenue, everything that you make in a year, and then all relevant costs you subtract from that

Murilo:

Yes.

Bart:

Where what is left is your profit. Yes. And based on the profit, you're gonna have to pay a tax according to a certain tax rate. Yes. Right?

Bart:

What used to be the case for R&D projects is that you could allocate that as a cost for 100% of the cost that you made. Okay. So if you made €100 of cost in R&D Yeah. You could deduct that from your revenue.

Murilo:

Okay.

Bart:

Meaning that your profit gets smaller, and you have to pay less tax.

Murilo:

Ah, okay. I see. Uh-huh.

Bart:

And now, with this new regulation, you basically need to activate the R&D on your balance sheet. So it basically means that the company is doing R&D towards building some value.

Murilo:

Okay.

Bart:

And the cost for this value that you're creating, you can't write it off in one go. You need to do it over, I think, four or five years, actually.

Murilo:

Okay.

Bart:

So let's say it's five years. Normally, you could deduct this €100 from your revenue to get to your profit and the tax that you need to pay. Now you can only deduct €20 a year, spread over those five years.

Murilo:

I

Bart:

see. Meaning that you need to pay way more tax.

Murilo:

Yeah. Yeah. Yeah.

Bart:

So you can That's how I understand it.
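
Bart's €100 example works out like this. A simplified sketch: the revenue and the flat 21% tax rate are assumed numbers for illustration, it covers a single year of spend, and it ignores details like the mid-year convention the actual US rules apply.

```python
# Simplified before/after comparison of the Section 174 change.
revenue = 1_000.0   # assumed yearly revenue
rd_cost = 100.0     # R&D spend, Bart's example
tax_rate = 0.21     # assumed flat corporate tax rate

# Before: expense 100% of the R&D cost immediately.
profit_before = revenue - rd_cost         # taxable profit of 900
tax_before = profit_before * tax_rate

# After: amortize the same cost over 5 years,
# so only 1/5 of it is deductible in year one.
deduction_year1 = rd_cost / 5             # only 20 deductible now
profit_after = revenue - deduction_year1  # taxable profit of 980
tax_after = profit_after * tax_rate

# The same spend, but a bigger tax bill in year one.
extra_tax_year1 = tax_after - tax_before
```

The remaining €80 of deductions still arrive in later years, but the company pays more tax now, which is exactly the cash-flow squeeze the article links to layoffs.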

Murilo:

Okay. So then people are like and because they need

Bart:

to pay more but this is

Murilo:

not new on legislation. Or is this something new?

Bart:

This is since 2022. So it's rather new.

Murilo:

Okay. So then since that, because the cost for R and D are higher, people are just cutting it off.

Bart:

And potentially moving to other geographic locations. And, apparently, when you read the reports on this, it also caught a lot of the big companies off guard.

Murilo:

Oh, really? Yeah. I wonder how these things happen. Right? Like, they have a lot of people, expensive lawyers looking to these things.

Murilo:

No?

Bart:

I would say. Yeah.

Murilo:

Where did they go actually now?

Bart:

Well, I think there are a lot of these things going on. Like, there was some discussion about whether this is coming to Europe. You have a lot of different things going on at the same time. So, apparently, this is one driver. And I think there was probably overhiring, like, just post-COVID.

Murilo:

Yeah.

Bart:

We have the whole AI revolution coming up Yeah. Which makes people I think the whole AI stuff today makes a lot of companies very hesitant. Like, it's a bit let's wait and see Yeah. Before we hire new people.

Murilo:

Yeah. Yeah. Yeah. I mean, you see even big companies like meta, like tech companies that have a reputation Yeah. Making claims about these things.

Murilo:

Yeah.

Bart:

And but even then, like, also the the AI stuff, like, the there are a lot of drivers that make that the job market today is, I think, difficult for engineers. And this is very of course, this is very anecdotal, what I'm gonna say, but but I've heard a lot of people that that graduated September, of course, I'm talking Belgium here

Murilo:

Yeah.

Bart:

That are that had a either had to to search very long Yeah. For a job or are still searching. Yeah. And that is I've I've never really known this in the last ten years within this, like, within this field.

Murilo:

Yeah. I think yeah.

Bart:

And this, of course, is very anecdotal, and especially in Belgium, it takes a long time for all these numbers to get published to actually see what this does. Like, there's a bit of a lag on the census data there. But

Murilo:

yeah. Yeah. But if I mean, I also get that feeling. Even though, like, not me personally. Right?

Murilo:

Like, not firsthand for me. I mean, yeah, I've been employed for for a while. But you also see, like, on Reddit, like, you see, like, some posts there. Some people say, oh, yeah. I've been studying computer science, and I cannot find a job.

Murilo:

And I don't know if maybe there's a bias as well, because they also read articles about it, and they're a bit more at ease to make these posts as well. So maybe there are some ripple effects there. But, like, you also see articles of people saying, this is more from the US now, that the software engineering golden era is kinda over. Because I feel like a few years ago, everyone would say, like, are you studying computer science? You'll get a job. Like, it's fine.

Murilo:

Like, you're done. You're you're figured out for life. I've even seen posts, I think, already. I think in Belgium even that there was a kid saying, like, I have committed so much of my life for this, and now this is all going thrown away. It's like almost like it was a sacrifice.

Murilo:

Right? Yeah. To me,

Bart:

it's also

Murilo:

a bit the the wrong way to look at it. But but it does it does feel a bit more like this.

Bart:

And let's be realistic about what is happening with AI-assisted coding. I mean, if you're a very good software engineer today and you use something like Cursor or Claude Code or whatever, I mean, you're five times more efficient.

Murilo:

Yeah.

Bart:

Sure. You get more done with less time, and that simply results in smaller teams.

Murilo:

Yeah. And I think also if you're a junior and you don't because I feel like now this like, if your experience really propels you forward. Yeah. But I feel like if you're a junior, you can really it's gonna make you shoot yourself in the foot.

Bart:

I think if you're yeah. Yeah. What

Murilo:

It's almost like giving a gun to someone inexperienced. Do they know what to do with it?

Bart:

I I would say if you're experienced, it's it's really an enabler. Like, it's Yeah. Makes you very, like, way stronger than you were. Yeah. But I think you get the same effect with someone that is young and is very passionate.

Murilo:

Oh, yeah. True.

Bart:

But if you have someone that is very passive on learning

Murilo:

Yeah. I think

Bart:

it becomes hard. Yeah.

Murilo:

I think it's someone that is curious

Bart:

Yeah.

Murilo:

That will want to know the why of things and try to understand what's good and what's bad. But I feel like the thing is, a lot of people that use these AI-assisted coding tools use them because, like, it's lazy. It's a bit the opposite of curiosity. You know? Like, you don't wanna do it, so you just tell someone, do it.

Bart:

No? That's one way of looking at it.

Murilo:

Yeah. I mean, that's how I feel. I feel like a lot of the times when AI-assisted coding is brought up, it's like I mean, there are even stories. Right? Like, the guy that created a business about it, and then it turns out there were keys hard-coded in his code, and then, like, he's like, ah, I didn't know how to code, but now I don't have to learn it. Yeah.

Murilo:

Yeah. Because this thing would

Bart:

do for me. It's not too late.

Murilo:

You know? So but yeah. But I agree. I think if you if you if you still have the the developer spirit, let's say, you're still trying to understand things, what makes things good and what makes things bad, I think I think it's it would take some and I think it's a skill as well. I don't think it's just like you need to learn how to use it as well.

Murilo:

I I feel like I don't really know that well still.

Bart:

I think the difficult thing about, at least where we're at today with AI-assisted coding, is that it moves very quickly. And, like, I would say every other month, you need to change your tactic to be really at the frontier of where this is going. And that's, like, experimenting with new stuff, etcetera. Like, I don't think that is your typical standard engineer. Yeah. I think that's, like, curiosity and doing new stuff and failing and falling on your nose and Yeah.

Bart:

Yeah. Standing up and being victorious. Like, it fits a few people, but it's not for everybody. Right?

Murilo:

Yeah. It's true.

Bart:

And but at the same time, that is very much where we're at, because it's moving so quickly. So from the moment you learn something, two months later it's outdated. But you need that experience to really build a feeling for it. Yes. You really build your toolkit, basically, today.

Murilo:

Yeah. And one thing, I was talking to someone the other day: there was the prompt engineering thing, maybe one, two years ago. Mhmm. But I also feel like even the prompts, right, like, they evolved kinda quickly. Like, I remember before, it was like, you should really say "you're an expert in this," and so on.

Murilo:

But then over time, there were models that were already creating these things, and, like, it moves very fast. Right? So I feel like even the prompting, because there's the whole idea of prompt engineering and, like, prompt management, and you can share prompts with people and all these things. But indeed, if it moves very quickly, or if it changes so much from one model to another, maybe it's better for you to just always be descriptive. Right?

Murilo:

But then you don't need a prompt management tool for this. Well, yeah, we're gonna talk more about AI later. Because you also mentioned Claude Code. We'll put a pin in that.

Bart:

Well, let's go to that one. Right?

Murilo:

Let's do it then. So, and it's my turn. No? Yes. So, Introducing Claude 4: Anthropic's Claude 4 family, Opus 4 and Sonnet 4, promises state-of-the-art coding, extended tool use, and improved memory for multi-hour agent workflows.

Murilo:

The post touts that, and I quote, Claude Opus 4 is the world's best coding model, citing a 72.5% SWE-bench score and new parallel tool execution. By keeping prices steady and expanding availability across AWS, Google, and its own API, Anthropic claims to cement Claude as developers' go-to frontier model. This one looks more impressive.

Bart:

Yeah. This one I did try.

Murilo:

Did you try? I didn't try yet. No. Because also, it's like, yeah, traveling all these things, I didn't get as much time.

Bart:

So this was released, I want to say, weeks ago. Maybe three.

Murilo:

May 22. Yeah.

Bart:

And we were at a moment — when it comes to coding, because I use this very much in the context of coding — where Google Gemini 2.5 Pro was considered a bit the best for coding. Yeah. Then this one came out, and I've used the chat functionality.

Murilo:

Mhmm.

Bart:

Feels very good. Yeah. So Sonnet is the lighter version, Opus the bigger model.

Murilo:

Chat, you mean, like, just asking questions about the code?

Bart:

Or — no, no. I mean just with the chat interface. Yeah. It feels very good.

Bart:

Like — I can't really say that I've noticed a difference with 3.7, Mhmm, just using it in a chat way. Opus 4 — I don't really notice that it's much better than Sonnet when I just interactively chat with it, ask it to make a report or something. Yeah. But apparently — again, anecdotal stuff I read — it really excels when, well — it's apparently very context hungry.

Bart:

So from the moment you throw a lot of different data at it — let's say reports or meeting notes that you're interested in — and you ask it to distill insights from that, it apparently outshines a lot of other models. It's apparently very context hungry. That's really when you notice it.

Murilo:

Context hungry, but also it can digest the content. Yeah. Exactly. Yeah. Yeah.

Murilo:

Yeah. Context.

Bart:

Yeah. And when it comes to coding — so what I actually did at that point: I was using Cline, which is a VS Code extension where you can use all these models. I actually switched to Claude Code, which is a CLI that comes from Anthropic. Mhmm. Yep.

Bart:

That uses — it's not very transparent, but mainly Sonnet and Opus 4 under the hood. Okay. And it is ridiculously good.

Murilo:

So how does it if you had to explain?

Bart:

Is it a CLI tool? It's a CLI tool, and it's basically a chat interface that has access to all your files and can write files, can search, or whatever.

Murilo:

And then you just say, on the CLI, the terminal, do this, do that, and it changes the files for you, and then you look at the files afterwards.

Bart:

Do this, do that — and then you can say, auto-approve every edit, or, I want to see every edit that you do before and explicitly approve it. So you have a bit of a choice. How I typically use it in my workflow: on one screen I have my VS Code editor, and next to that, a little bit smaller but on the same screen, I have Claude Code.

Murilo:

Okay. And then, as Claude Code in the terminal is changing the files, you actually see this in real time in VS Code.

Bart:

And it feels very, very, very strong. And one of the things that I think makes it very strong is that, well, it uses the state-of-the-art models — Yeah — Opus and Sonnet 4. That is one thing. But also, it's very good at search.

Bart:

So it very quickly finds the files that are relevant.

Murilo:

And

Bart:

it's also very good — and this links back a bit to search — at replacing lines in a file. So what you typically run into with Cursor or Cline or whatever, when you are editing very large files

Murilo:

Mhmm.

Bart:

is that it typically replaces within the file. So you say, okay, only that line or those two lines I want to replace with something else. That makes it very efficient. It's very fast. Yeah.

Bart:

If that fails — if something doesn't match — then it starts rewriting the whole file. And when it starts rewriting a file of — Yeah — let's say, 600 lines, it typically goes wrong.

Murilo:

Yeah.

Bart:

And what Claude Code is very, very good at is that if it fails a time with its simple replace-in-file, it starts using other CLI tools. Oh, really? It uses a regex tool to find very precisely where that original line is in the file, and then basically enhances the context to get a good replacement instruction. I've never had it fail to do partial updates.

Bart:

I've never had it fall back to writing a full file.
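Anthropic hasn't published how Claude Code's edit mechanism works internally, so this is only a guess at the pattern Bart describes — try an exact string replacement first, and fall back to a regex search that tolerates whitespace drift, the way a grep pass can re-locate a line. All names here are made up for illustration:

```python
import re

def patch_file_text(text: str, old: str, new: str) -> str:
    """Replace `old` with `new` in `text`, falling back to a
    whitespace-tolerant regex search when the exact span no
    longer matches — avoiding a full-file rewrite."""
    if old in text:
        # Fast path: exact match, replace just that span once.
        return text.replace(old, new, 1)
    # Fallback: tolerate indentation/spacing differences by joining
    # the escaped tokens of `old` with a flexible whitespace pattern.
    pattern = r"\s+".join(re.escape(tok) for tok in old.split())
    match = re.search(pattern, text)
    if match is None:
        raise ValueError("could not locate the span to replace")
    return text[: match.start()] + new + text[match.end():]

source = "def add(a, b):\n    return a+b\n"
patched = patch_file_text(source, "return a+b", "return a + b")
```

The point of the fallback is exactly what the transcript describes: the tool only ever touches the lines it located, so a 600-line file never gets rewritten wholesale.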

Murilo:

So the Claude Code CLI is, like, in the terminal. It has access to files, but it also has access to other CLI tools.

Bart:

All the CLI tools, the web, etcetera.

Murilo:

Oh, wow.

Bart:

That's very strong. Very strong.

Murilo:

Yeah. Another thing that I thought was interesting. So it does feel more — I'm gonna say the word agentic here, though I don't like to, because I feel like it's very bloated. But, like, you can actually have tools. Right?

Murilo:

We mentioned the CLI, web browsing, access to files — so every time I say agentic, that's what I mean, just to be very concrete. But even on the benchmarks here, they are benchmarking specifically, like, parallel tool calling, or agentic terminal coding, and stuff like that. Right? Which also feels like another step in that direction.

Bart:

Yeah. Yeah. But, like, it's more specific. Models, like Yeah. Support this better, and

Murilo:

these Exactly.

Bart:

These capabilities are very strong in

Murilo:

the context of — Exactly. And I feel like — yeah. So this is extended thinking with tool use. So that's the reasoning that we mentioned earlier. Parallel tool execution, memory improvements, blah blah blah.

Murilo:

So, yeah, it's a very clear signal, right, that we're walking towards the more agentic things. That's where they're focusing. And Anthropic really does feel like they're catering more to the developer community.

Bart:

And already for a long time. Right? Yeah. Indeed. They're leading the Gen AI coding space.

Murilo:

Yeah. But even when they were still at 3.5, I remember everyone was saying, like, yeah, if I want to get shit done, I just go for 3.5. People would try the others, but they'd go back to 3.5.

Bart:

And maybe also relevant in that space — because I think they're very good at Gen-AI-assisted coding and now have their own tool with Claude Code — the others are not sitting around. So what OpenAI actually also recently did is buy Windsurf — I saw that — which is also an AI-driven IDE.

Murilo:

It's like a Cursor. Yeah. A bit of a Cursor or Copilot alternative. Yeah.

Bart:

For $3 billion. Yeah. So let's see what they do with it.

Murilo:

What did you think of the price tag?

Bart:

Honestly, all these prices are so crazy, it's hard for me to reason about these types of deals. I think Windsurf is something that is still very young. Right? Like, it's only a few years old.

Murilo:

And Windsurf, is it a VS Code fork or not?

Bart:

Good question. I don't know, to be honest.

Murilo:

Not sure. I know Cursor is a VS Code fork. Cursor is. Yeah. Yeah.

Murilo:

But I'm also wondering, like, okay, it's a lot of money — what are they gonna do with it? It's a big investment, but what do they gain from it?

Bart:

Well, I think instead of just providing a hopefully good model that can be used for coding, you can have, like, the whole tool chain, and then ask for enterprise licenses and

Murilo:

Yeah.

Bart:

Yeah. These kind of things.

Murilo:

I see. I see. I see. I see. Yeah.

Murilo:

True. I also heard — I listened to a podcast where they said Zed also has some AI features. Yeah. But Zed is interesting because they really built everything from scratch.

Bart:

Yeah. Yeah.

Murilo:

But

Bart:

And I have this feeling that the Zed hype is over. I don't hear about it at all anymore.

Murilo:

I don't hear about it either. It was short-lived.

Bart:

It was just before the whole AI hype happened.

Murilo:

Because it's written in Rust. That's why

Bart:

it's that's why there was a

Murilo:

peak, and then it's gone. But yeah. I mean, on that podcast, they actually mentioned that because they built the terminal from scratch, they could actually have some more CLI control as well, and, like, the Git stuff, and you could actually make calls. So something that I was also curious about, but I haven't tried it — I'm still comfortable with my current setup, to be honest. What else?

Murilo:

What else? What else? What did we go through? Ah, dbt Fusion. Is it a

Bart:

Yes. The dbt Fusion engine. Founder Tristan Handy lays out how the freshly launched Fusion engine, a complete rewrite of dbt Core, will speed up parsing 30-fold, enable local execution, and one day transpile SQL across data platforms. I quote: today we launched the dbt Fusion engine, a complete rewrite of dbt from the ground up, Tristan explains, framing the overhaul as essential for scale. The roadmap suggests dbt could slash warehouse costs and free teams from vendor lock-in, marking a bold shift from incremental tweaks to deep platform bets.

Murilo:

Yes. So dbt Fusion — actually, it's not something super new, because dbt Fusion came from something that they acquired. What's the name? SDF.

Bart:

Yeah. Which I didn't know.

Murilo:

You didn't know.

Bart:

So, like, a mini, mini, mini explanation of dbt. dbt is a tool that is very big today in the ELT scene — extract, load, transform — and it mainly focuses on the T of that, the transformation. So if you have raw data, you want to do transformations on that, potentially in multiple layers, towards different types of destinations, etcetera. dbt is, I think, today a bit the de facto standard.

Bart:

Yeah. It's the de facto standard at all those businesses today. So what they did is they have overhauled their open source engine, dbt Core, and replaced it with a new one, which is called dbt Fusion, and which apparently originates from this acquisition that they did. SDF, it's called?

Murilo:

Yes. SDF.

Bart:

Which apparently makes the whole execution much faster, optimizes the whole query structure, but also makes it much more intelligent.

Murilo:

And what do you mean by intelligent?

Bart:

Meaning that if you are writing SQL queries, your editor — I think they also now published an official VS Code plugin — meaning that you as a developer really get smart syntax highlighting. So let's say you're doing a transformation that is based on another transformation: even though a column doesn't exist yet, because the transformation it's based on still needs to happen, the model that inherits it already knows that this column will exist and what type it will be, etcetera, etcetera.

Murilo:

Oh, yeah. Yeah. Yeah. Yeah. And

Bart:

that also gives very interesting information on lineage. Before, you could see this dataset comes from that dataset, but now you can apparently see that this column in dataset B comes from that column in dataset A, and maybe also a column in dataset whatever. Like — Mhmm — you really see detailed lineage between them.

Murilo:

So interesting. Yeah. And I think, also, to double-click on what you mentioned about dbt: it is written in Python, and I think Fusion is now written in Rust for performance reasons.

Murilo:

But, basically, it's like — you had the nice Jinja syntax, the curly braces, that Python-based dbt would just replace — Mhmm — with actual SQL that would run in your Snowflake or Athena or whatever. Mhmm. And then you just execute that.

Murilo:

But because it's really just replacing and running on the warehouse, you didn't have all that context. Right? So I think now — and in the end, from what I read, everything is about the speed. Before, it was too slow to do these things in your terminal, in your editor. But now that you can do this very fast, because you have this Rust-based Fusion engine, you can also enable all these things.
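dbt's actual compiler is far more involved — it does real Jinja rendering against a full project manifest — but the replace-then-run idea described above can be sketched with a toy example. The model and table names here are hypothetical, purely for illustration:

```python
import re

# Toy "compile" step: dbt-style {{ ref('model') }} placeholders are
# swapped for concrete warehouse table names before the SQL is sent
# to Snowflake/Athena. This mapping is made up for the example.
RELATIONS = {"stg_orders": "analytics.stg_orders"}

def compile_model(sql: str) -> str:
    """Resolve {{ ref('...') }} placeholders to fully qualified names."""
    def resolve(match: re.Match) -> str:
        return RELATIONS[match.group(1)]
    return re.sub(r"\{\{\s*ref\('([^']+)'\)\s*\}\}", resolve, sql)

raw = "select order_id, amount from {{ ref('stg_orders') }}"
compiled = compile_model(raw)
# compiled == "select order_id, amount from analytics.stg_orders"
```

Because classic dbt Core only does this textual substitution and ships the result to the warehouse, it never sees column names or types locally — which is exactly the context gap the Fusion engine is meant to close.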

Murilo:

One of the things they even mention is MCP here somewhere — like, an agent-based chat experience with MCP. So you can say, hey, ChatGPT or whatever model you choose, go do this or that, and it can actually query very quickly, and you get the answers back. Before, because it took a long time, especially if you have a large project, that wouldn't be possible.

Bart:

Mhmm.

Murilo:

So when I heard it, there was a lot of noise about, ah, okay, another thing in Rust — okay, it takes some time, but do you really need it? Is it just because of the hype? But I think when you look at the things they're mentioning, I see how this quality-of-life improvement can enable a lot of things as well. Right?

Murilo:

How do you feel about this?

Bart:

Yeah. It's interesting. I think it allows them to move forward. And I think, probably more from a strategic point of view, it gives them more value as a company. I think they've been struggling very much to build extra value with their cloud offering around dbt Core.

Bart:

And I think, arguably, that is still very limited value. So you have a lot of companies that are just building on dbt Core themselves and setting up the tooling around it themselves, because the extra features that they provide are not relevant enough that it's really a game changer.

Murilo:

Yeah. They're not indispensable.

Bart:

They're not indispensable. And the strategy they may take here — and what I understand is that this is gonna be proprietary, not open source, but I might be wrong there — is that they really have something that is a significant step up from open source: it makes it a lot more efficient, makes it easier for developers to develop SQL. Something that results in less time, fewer errors, whatever. That's a smart bet. Right?

Bart:

Like, aside from all the efficiency gains you get from it. Right? Yeah. Yeah. From a strategy point of view, for them, it's smart.

Bart:

So, yeah, let's see where it where it turns out.

Murilo:

Yeah. Let's see. Indeed. So there is — I just put up another page here from dbt Labs — dbt Fusion has a new license, because it's written in Rust and compiled. Right?

Murilo:

And, actually, I'm not even sure — can you, if you have a binary, reverse engineer the code or not?

Bart:

Depends where it comes from, and, typically, it's hard to do.

Murilo:

Yeah. I I can imagine it's hard

Bart:

to do. It's always doable, but it's very Yeah.

Murilo:

So maybe it's also part of, like yeah. Because if it's in Python, you can always look at the code. Right? Yeah. Yeah.

Murilo:

Even if you don't — yeah, it's always — but yeah. So I think I saw something like, yeah, they have a separate license. dbt Core, they say, is still gonna be available. And I think some parts of the code are gonna be available, but some parts are not. Something like that.

Murilo:

Yeah. But yeah. Let's see where it's headed. I mean, again, it seems like an interesting thing. And theirs is not the only dbt parser that was written in Rust.

Murilo:

I think there was another one as well — some experiments that people did where they reached about a 30-fold improvement too. So let's see. Let's see.

Murilo:

What else? Scrapling. I think this one is mine. Right? Scrapling.

Murilo:

So this is a tool that I came across. Scrapling is an open source Python library that claims stealthy, high-performance web scraping with automatic adaptation to site changes and anti-bot defenses. Its README highlights that, and I quote, Scrapling is a high-performance, intelligent web scraping library for Python that automatically adapts to website changes. More than 5,000 stars since the April 2025 release.

Murilo:

The project shows continued demand for lightweight, developer-friendly scraping tools that outsmart detection. And I'm gonna put this on the screen. So it's another scraper, basically. But, apparently, it's fast. And I think it's also interesting that it comes at a time when LLMs are kind of scraping the web as well. Yeah.

Murilo:

And a lot of people are putting things up to stop them, but this has some things, in turn, to bypass these anti-bot measures. Another thing that I thought was interesting is that they actually have sponsors in the README for this project. So it's a way, I guess, for him to be able to finance his open source work.
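Scrapling's own API isn't reproduced here. As a generic illustration of the low-level parsing work that batteries-included scrapers (and Beautiful Soup) wrap for you, a stdlib-only link extractor might look like:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags — the kind of plumbing
    that scraping libraries abstract away behind selectors."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

html = '<p><a href="/docs">Docs</a> and <a href="/blog">Blog</a></p>'
parser = LinkExtractor()
parser.feed(html)
# parser.links == ["/docs", "/blog"]
```

On top of this kind of parsing, libraries like Scrapling layer fetching, retries, and the anti-detection behavior discussed above — none of which is shown here.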

Bart:

That's interesting.

Murilo:

Yeah. Right? And I thought it was also interesting — I haven't had the opportunity to try it, but something like Beautiful Soup, you know, a very battle-tested technology, is still being reinvented here and there.

Bart:

Yeah. Yeah. Maybe also a little bit of context here: what we try to do every time is, in one of the items, include an open source package.

Murilo:

Yeah. Something that people can get their hands dirty with.

Bart:

That's the one today. But interesting — you posted it in our chat channel, and it's the first time I heard about it. It's 5,500 stars, which is significant.

Bart:

And why I was surprised is that I did a review of, like, all the major ones — I needed a scraper a while back. Oh, really? A few weeks back, actually, not too long ago. And I googled a lot, but this one didn't come up. Really?

Bart:

Yeah. It didn't. I mean, I think they need to work on their SEO.

Murilo:

Yeah. Maybe. Maybe.

Bart:

But, yeah, five and a half thousand stars is a lot. Yeah. I think what you see with a lot of these modern scrapers, which you do not see with Beautiful Soup, arguably, is that they come batteries included. Like, they already handle a lot of things you don't need to worry about.

Murilo:

Yeah. Yeah. Yeah.

Bart:

I think, of course, the interesting thing here is that the promise is more like: we will not be denied, because we hide — we're stealthy about being a scraper. Yeah. It's a bit tough. That's a weird thing that has happened over the last years, where before, I would say, something like this would never have passed the ethics check of the community — Yeah — if this was your only premise.

Bart:

But ever since LLMs became very big and there are a lot of companies that are extremely data hungry, like, scraping the web has become, like, a big thing. And

Murilo:

Yeah.

Bart:

There's a big need for it, and we don't really think about, like, are we allowed to scrape? We're just explicitly searching for tools that

Murilo:

Yeah. Yeah. Indeed. I mean can

Bart:

do this even though the counterparty says, don't scrape us

Murilo:

Yeah.

Bart:

has measures in place to block something that is clearly a scraper.

Murilo:

And this guy's advertising: look, we can bypass it.

Bart:

Yeah. Exactly. Right. It's a bit I don't know, man. It's a bit weird.

Murilo:

Yeah. It's a bit weird indeed. But, like, even OpenAI — weren't they suing people? Like, for using their ChatGPT?

Bart:

Yeah. The bar is very low today.

Murilo:

Yeah. And

Bart:

and every major player, like, just basically scraped the whole web.

Murilo:

Yeah. But it's like — I remember OpenAI got mad that someone was scraping their stuff, like, or using it to train AI. And then the whole thing with the Ghibli images, you know,

Bart:

was huge. Yeah. So, yeah — what is copyright these days? Right?

Bart:

Doesn't exist anymore.

Murilo:

Yeah. But are there trials still ongoing? Can something still explode, or do you think —

Bart:

They're still ongoing, but they typically take a lot of years. And, yeah, let's see what comes out.

Murilo:

Let's see. Let's see. Let's be hopeful. What else? What else?

Bart:

And even if you make the parallel there to things like music — copying CD-ROMs, games, stuff like that — it also took years to get to clear law on this.

Murilo:

Yeah. That's true.

Bart:

I think this is — when it comes to, quote, unquote, copyright and scraping data, like, there's no clear line yet on what counts as allowed use. Right? Like, it's — Yeah. Yeah.

Bart:

I think that definition of what is legal and what's not legal — it will take — it has to go through the courts to

Murilo:

Yeah.

Bart:

Get there.

Murilo:

And, yeah, this thing is moving way, way, way, way, way faster.

Bart:

Yeah. Yeah. The technology is moving way faster than the courts will do. Yeah. That is true.

Murilo:

So that's the thing — that's the challenge. What else do we have? We have something here: Apple researchers just released a damning paper that pours cold water on the entire AI industry. What is this about? So

Bart:

an Apple research team argues that leading reasoning language models plateau — and even collapse — on complex puzzles, challenging industry claims of true machine reasoning. The study warns that frontier reasoning models face a complete accuracy collapse beyond certain complexities, calling current performance an illusion of thinking. The finding could temper expectations for next-gen LLMs and intensify scrutiny of benchmarking, just as Apple readies its own AI features.

Murilo:

Yeah.

Bart:

So I put this in because there was a lot of hype, a lot of chatter about this article — basically, a lot of different takes, from "this is bullshit, what they're saying here" to "this is why we will never have AGI." Like, the AI deniers really see it as: see, we were right.

Murilo:

Yeah. Yeah.

Bart:

Yeah. Yeah. It's not smart.

Murilo:

Yeah. Yeah. Yeah. So

Bart:

what they basically do is — the major test that they look at is, I think it's called the Tower of Hanoi. It's a puzzle, and a puzzle that you can quite easily increase in complexity. And what they basically see is that, from a certain moment onwards, the performance degrades or completely collapses. That's a bit what's in there.
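For context, the Tower of Hanoi is a natural fit for this kind of scaling test because the optimal solution for n disks takes exactly 2^n − 1 moves, so each extra disk doubles the length of the required answer. A minimal solver:

```python
def hanoi(n, source="A", target="C", spare="B", moves=None):
    """Return the optimal move list for n disks (2**n - 1 moves).

    Classic recursion: park the top n-1 disks on the spare peg,
    move the largest disk, then park the n-1 disks on top of it.
    """
    if moves is None:
        moves = []
    if n == 0:
        return moves
    hanoi(n - 1, source, spare, target, moves)
    moves.append((source, target))  # move the largest free disk
    hanoi(n - 1, spare, target, source, moves)
    return moves

solution = hanoi(3)  # 7 moves; hanoi(10) would need 1023
```

That exponential growth is what lets the paper dial up complexity smoothly and watch where model accuracy falls off.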

Bart:

Yeah. What is a bit weird, of course — what makes this a very, let's say, heavy statement, what they say in this paper, this "illusion of thinking" — is that it's Apple that is publishing this. Right? Yeah.

Murilo:

Yeah. Yeah.

Bart:

I'm not sure, to be honest — I haven't really looked into it, but it's big because it's by people from Apple. I'm not sure how much this comes from

Murilo:

The actual Apple.

Bart:

Like, is this just another paper that Apple published — there are thousands of research papers — or is this something more formal? Yeah. So I don't know that. I'm a little bit skeptical of this, because the reasoning in the paper is also that the good performance on benchmarks that we see — they question the benchmarks a bit.

Bart:

It's that, over time, a lot of the things in these benchmarks might be in the training data — contamination — so that we're overestimating the performance of the existing models. At the same time — and that's the counterargument to this — the Tower of Hanoi, I mean, there's a Wikipedia page on what the Tower of Hanoi is.

Murilo:

So Yeah.

Bart:

So I don't think that's really an argument that you can make. It's also a very specific puzzle — Yeah — where you can make the argument, like, it's not really relevant to whatever we do with these models.

Murilo:

Yeah. I think it's yeah. When you try to really measure what it what intelligence really is, it also gets really tricky. Right?

Bart:

Yeah. So I think it's a very big statement. Basically, what they say is that reasoning is not reasoning. And, yeah, okay.

Bart:

Sure. Right? Like, what what what's in the what's what's in the words. Right? Yeah.

Bart:

Exactly. It's true. Like, I mean, it's still bits and bytes. Right? Like, whatever reasoning is in that context.

Murilo:

Yeah.

Bart:

But what we do see is that when you add reasoning, on a lot of these tests the performance improves. Not all, but on a lot of them it improves.

Murilo:

Yeah. Yeah. I think — yes. But I think, for me, it kinda goes back to this being a tool.

Murilo:

Right? Like, I feel like the resistance to — or the promotion of — this being intelligence, this being AGI and all these things, is a bit counterproductive. Because I do feel like, yeah, when you look at reasoning and you look at complex tests — okay, benchmarks are subjective, but I think we can all kind of agree that they help the performance.

Murilo:

Right? They help with the the the usual

Bart:

model. They're better.

Murilo:

Exactly. Yeah. Right? So yeah. Okay.

Murilo:

The reasoning — maybe it's not this, maybe it's not that, blah blah blah. Okay. But it does help you get a better output. Yeah. Right?

Murilo:

But one thing I heard as well — and this is from the data rich chat, and I don't know if it's this paper, that's why I wanted to ask you — is that they're also saying that reasoning models, if the task is simple, can overthink. Yeah. Was it this paper that said that, or was it another one?

Bart:

I don't know, to be honest, but I think there's general consensus on that. Yeah. So when I translate that to my own efforts in Gen-AI-assisted coding: when you use something like Cline, or Cursor for that matter, you can very specifically say, for this type of task, use this model. Yeah. Right?

Bart:

Well, until recently, when I switched to Claude Code, what I typically did is: for code edits and source generation, I used a non-reasoning model. But for, let's say, building an architectural plan — building a plan to build a feature — Yeah — you use a reasoning model. Yeah.

Bart:

And this typically works very well. The architecture plan is better, but the code editing is also better without reasoning. Because sometimes, when you add reasoning to code generation or code editing, it starts making too-complex patterns — things that are not needed, elaborating on things that are just — Yeah — like, it over-complicates.

Bart:

Yeah.
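The per-task model routing described above — a reasoning model for planning, a plain model for mechanical edits — can be thought of as a simple lookup. The model names and task labels below are purely hypothetical placeholders, not real identifiers from any tool:

```python
# Hypothetical routing table: planning tasks go to a reasoning model,
# mechanical code edits to a plain (non-reasoning) model.
TASK_TO_MODEL = {
    "plan_architecture": "reasoning-model",
    "edit_code": "plain-model",
    "generate_code": "plain-model",
}

def pick_model(task: str) -> str:
    """Choose a model for a task, defaulting to the plain model —
    overthinking a simple edit is the costlier failure mode."""
    return TASK_TO_MODEL.get(task, "plain-model")

chosen = pick_model("plan_architecture")
```

Tools like Cline and Cursor expose this choice in their settings rather than in code; the sketch just makes the routing rule explicit.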

Murilo:

Yeah. Yeah. That I also felt that. But I think that's also the thing, and that's why for me, I I say, like, it's you should really think of it as a

Bart:

tool. Exactly.

Murilo:

Because, like, yeah, for some things this one is gonna be better; for some things that one is gonna be better. You need to learn how to use it. These things are gonna evolve. This is gonna change.

Bart:

I think the important thing with this article — that's also a bit why I put it in there — is that it's very much hyped this week. But also, like, don't take this as an argument not to invest in AI.

Murilo:

Yeah. For sure.

Bart:

Not to invest in understanding reasoning. Like, test all these models and Yeah. And do what works for you.

Murilo:

Yeah. Yeah. No. But then if and I think I I fully agree. Right?

Murilo:

It's like, yeah, maybe this is not actual intelligence, but if you're not gonna use it, it's like it still helps you. Right? So true. True. True.

Murilo:

True. True. I also wonder — because Apple, they really fumbled the whole AI story. Yeah? Yeah.

Murilo:

Like, they really fumbled. And there was even an article — I think I skimmed through it — saying people would be very critical of Apple because they fumbled the Apple Intelligence thing; they rolled it back. And they were saying, like, maybe Apple needs to acquire a company that's doing well, but that would be a very un-Apple thing to do. So — but they would be very — like, Apple has had Siri for so long.

Murilo:

Siri has never been, like, the solution to all your problems. Then they had the Apple Intelligence thing that was also fumbling a lot with the notifications and all that; they had to roll it back. And now it's 2025, we see all these models, and Apple hasn't — like, yeah.

Bart:

Yeah. That's a fair fair point. I think

Murilo:

And now we see this paper from Apple criticizing AI, which feels a bit — Yeah.

Bart:

That's also a bit how it's perceived by many communities. So this is just Apple trying to defend itself.

Murilo:

Yeah. It's

Bart:

Why they're not better. It's like — I think this is an interesting point. Apple just had their keynote, and there's some interesting stuff coming out of that when it comes to AI as well. Maybe something we can touch on next time.

Murilo:

Yes. Maybe something we can dive a bit deeper, and I'll also look for the paper as well. And then

Bart:

Because I tend to be — and I already said this — I tend to be more positive on Apple in this. But we'll — Yeah — let's discuss it.

Murilo:

We'll discuss it next time. I'll also think a bit more before I come more prepared. And the last one here.

Bart:

The last one is sad news.

Murilo:

Yes. Bill Atkinson dies of cancer at age 74. John Gruber reports that pioneering Macintosh programmer Bill Atkinson passed away at home on June 5 after battling pancreatic cancer. His family shared that, and I quote, he was at home in Portola Valley, in his bed, surrounded by family, remembering him as a remarkable person. Gruber calls Atkinson perhaps the most essential, quoting him, coder on the original Mac team, noting his innovations in QuickDraw, MacPaint, and HyperCard still shape software today.

Murilo:

Yep. Yeah.

Bart:

That's well, rest in peace.

Murilo:

Indeed. Rest in peace.

Bart:

Bill Atkinson apparently

Murilo:

Yeah.

Bart:

Played a huge, huge, huge role in the very, very early days at Apple. Mhmm. Question mark: would our MacBooks look the same if he had not been part of it? Probably not.

Murilo:

Probably not.

Bart:

So among the things he worked on — he developed a lot of efficient algorithms for drawing graphical interfaces to the screen. Something that really — I mean, shaped — Yeah — who Apple are, of course. He also worked on applications like MacPaint and, something which I find very cool, HyperCard. So HyperCard was sort of — a bit like you can make a slide deck, but you could add functionality to these things.

Bart:

Like, you could have a button that you could click, and instead of going to the next slide, it would go to something else. Like, you can really

Murilo:

make Yeah. Yeah. I see.

Bart:

Make, through this logic like, it was rich enough to really build full applications.

Murilo:

Oh, okay.

Bart:

Cool. A bit like maybe people from our age would have used Microsoft Access. Yeah. Like, there's a database, and you can build some visual stuff on top of that to have your own application. A bit like PowerApps.

Murilo:

Yeah. Okay.

Bart:

And you still have some cool open source implementations of HyperCard. That's also something I played around with a little a few years ago.

Murilo:

Oh, really?

Bart:

Because HyperCard itself, I think it was released when I was only a few years old. Oh, really? So I never used it myself back then. But, yeah, I think, I mean, someone who has been so

Murilo:

Influential. Right?

Bart:

Influential in something that, well, I, and you as well, I mean, when it comes to Mac products. Yeah. Apple products, that we use so much. I think it's worth mentioning.

Murilo:

For sure. No. It's good that you bring this up as well. I think also, yeah, I don't know why. I mean, maybe it's always just looking back, but you feel like the contributions back then, they I don't know.

Murilo:

I feel like it's so easy to do things today.

Bart:

Yeah. Yeah. Well, actually, maybe a listening tip Yeah. To stop with. I'm thinking now, there was a quite recent one, like, in the last two weeks.

Bart:

There was a Steve Ballmer interview, I think, by Nilay Patel of Decoder, but I'm not 100% sure. But I think it was on Decoder. Mhmm. It was super interesting. Like, I would recommend everybody to listen to it.

Bart:

Yeah. But also when you hear him talking about the early days, and that's a bit like the same days that Bill Atkinson really built this stuff. Like, when we talk about developers now, the house next door, there's a developer. Right? Yeah.

Bart:

And then you have a lot of neighbors and other developers. Yeah. But back then, it was basically, like, you had IBM, and you

Murilo:

had Yeah.

Bart:

Microsoft, and then at some point, you had Apple. But in between those, there were not a lot of developers even existing. Like, there were very few Yeah. It's true. Developer groups.

Murilo:

Yeah. It's

Bart:

true. So it was a completely different field. Like, a lot of the things that we take for granted today still had to be invented. Like Yeah. How do you write efficiently? How do you draw a rectangle on the screen?

Murilo:

Yeah. It's true. True.

Bart:

So

Murilo:

Yeah. It's everything.

Bart:

Apparently, like, he also worked on an efficient algorithm for a rounded rectangle. So a rectangle is round

Murilo:

With rounded edges? Yeah. There's a lot of stuff. Like, even today, yeah, I feel like I can kind of build an end to end app myself. Yeah.

Murilo:

Yeah. Right? I can use the cloud. I don't need infrastructure. I don't need to know the cables.

Murilo:

I don't need to know anything. Everything is really abstracted. There's a lot, I mean, and now you have the AI layer on top of it. Like, yeah, you know. But looking back at people that really needed to, like, they needed to rack servers. They needed to understand.

Murilo:

They needed to move the thing. You know? Like, it's another level.

Bart:

Yeah. Yeah.

Murilo:

Yeah. Right? So, yeah, when you think of these people, like, and the contributions and the impact that they had. Right? Like, one bright person that really moves things forward.

Murilo:

Right? Exactly. So yeah. True. And I think that's it for today.

Bart:

That's it for today. What did you think about this new format? It's the first time that we do it like this.

Murilo:

I think it's good. It's, like, 80, no, I'm 70% there, I would say.

Bart:

Yeah. I think the parts that we read from the article should be slightly shorter.

Murilo:

Yeah. I think so. Yeah. And I think maybe we need to be able to reorder them a bit Yeah. So we can, like, kinda more naturally Logically.

Bart:

Like

Murilo:

Or just cross it off Yeah. If we already talked about it. But I like it. I think it's a step in the right direction as well. It's also good because sometimes we read these throughout the week.

Murilo:

So if you have little summaries, like, it's Yeah. Reminders. Like, yeah. Definitely. Yeah.

Murilo:

Definitely. Yeah. So true. Cool. If anyone has any thoughts as well.

Bart:

Yeah. Yeah. Give us feedback. Give us feedback. That'd be cool.

Bart:

Rate us wherever you're listening to your podcast.

Murilo:

Yes.

Bart:

Quickest way to do this is just to click on the five stars.

Murilo:

Yes. Yeah.

Bart:

It's just The quickest way to

Murilo:

do that. It's just there. Like, most people are right handed. Yeah. It's closer to the right.

Murilo:

It's just there if you want. Yeah. And I know your time is valuable too, so that's why we are just leaving

Bart:

this. No. Yeah.

Murilo:

So that's it. Anything else?

Bart:

That's it. We'll see you all next week.

Murilo:

Yes. Now it's a weekly thing.

Bart:

That's a weekly thing.

Murilo:

Yes. See you next week, Bart. And do we have an outro sound? No. It's the same.

Bart:

Oh, and so we do this weekly thing. Right? And, maybe we said it when we did the prerecording, we're also gonna have some guests.

Murilo:

Yes.

Bart:

I actually have an interesting one. Oh, really? Yeah. Yeah. Oh, cool.

Bart:

Someone that is working at the government on AI legislation. Oh, don't spoil it too much. No. No. No.

Bart:

Yeah.

Murilo:

Alright, Bart. Thank you.

Bart:

Thank you, Murilo.

Murilo:

And thank you for listening. No. Maybe it's too much. Yeah.

Bart:

Don't look at the video from last night. Just listen. Just listen.

Murilo:

Alright. Thank you. Talk

Bart:

to you later. Ciao. Ciao.

Murilo:

Let me do the There's

Bart:

no popping out. Now this is more of, like, an intro thing. Right? Yeah. This is more of an intro.

Bart:

Can you reverse it?

Murilo:

Yeah. Yeah. Exactly. Ah, okay. Next time.

Bart:

Okay. K.

Murilo:

Cheers.

Creators and Guests

Bart Smeets
Host
Mostly dad of three. Tech founder. Sometimes a trail runner, now and then a cyclist. Trying to survive creative & outdoor splurges.
Murilo Kuniyoshi Suzart Cunha
Host
AI enthusiast turned MLOps specialist who balances his passion for machine learning with interests in open source, sports (particularly football and tennis), philosophy, and mindfulness, while actively contributing to the tech community through conference speaking and as an organizer for Python User Group Belgium.