Interviewing Charlotte Verhamme: Belgium’s AI strategy meets the EU AI Act
Murilo:Hi, everyone. Welcome to the Monkey Patching podcast, where we go bananas about all things Flanders and AI. Today, we have a special guest. We have Charlotte with us.
Bart:Exactly.
Murilo:You are the attaché for digital affairs in the Flemish government.
Charlotte:Yes. And I'm detached to the Permanent Representation of Belgium to the EU.
Murilo:Okay. Say more?
Bart:Maybe it's good that you introduce a bit who you are and what your role is.
Charlotte:I work for the Flemish government, but I follow up the European digital dossiers at the European level. For that, I'm basically being sent by Flanders to the Belgian diplomatic post where we maintain the relationship with the European Union. In that post, there are specialists on every topic who work on negotiating the European legislation, and I do the digital files for Flanders. We work together with other people from other governments who are also sent to that diplomatic post.
Charlotte:So from the FOD Economy, PPT, the FOD Foreign Affairs, for instance. And there we work together as specialists on digital legislation. I try to negotiate the Flemish position into the Belgian position, and then we try to get the Belgian position into the overall European file.
Murilo:Okay. So you kind of defend the Flemish interest within the Belgian scope?
Charlotte:And then
Murilo:And then the Belgian interest in the European scope?
Charlotte:Yes.
Murilo:it just you or do you have a team of?
Charlotte:For Flanders, it's just me. Oh. But within the Belgian delegations, we are with four or five people. Okay. So there's also a person for Wallonia.
Charlotte:There's one from the FOD Economy. There's one from PPT. And then on cyber, there are also some people from the FOD Foreign Affairs who work on those files.
Murilo:So what does a typical day for you look like?
Charlotte:A lot of reading, a lot of emails, and a lot of last-minute work trying to get the positions aligned, contact with the cabinets. Okay, here are our proposals, what do you think? Then we get feedback.
Charlotte:Then we defend the position in meetings for the whole of Belgium. And then, yeah, that's where it gets a bit more difficult, because digital affairs is category one. Now we get into the complex part. As you know, Belgium is divided in competences, so for digital affairs there's an agreement on which level represents Belgium at the European level.
Charlotte:So for instance, fisheries is completely a Flemish affair, because only Flanders has a sea where they go fishing. So on the European level, that's completely a Flemish competence, and in that council there's only someone representing Flanders. But on other files, it depends: depending on the competence division, there is more federal presence or more regional presence.
Charlotte:Education is more regional presence, but for digital affairs, the agreement was made in, I think, 1994, and we have had some state reforms in the meantime, but it's still the agreement from 1994. Telecom and digital affairs is category one, so the representation is only federal. I negotiate the position beforehand, but I'm not often in the Council itself. It's only if specific Flemish files are on the Council agenda that I come along.
Bart:And digital affairs covers a number of different topics, AI being one, like the telecom you mentioned.
Charlotte:Yeah. So Telecom is the name, but at the Council level you need to interpret Telecom as basically being digital affairs. The name no longer really reflects what it covers. For instance, the AI Act is in there, the European Business Wallet is in Telecom.
Bart:Ah, it's all okay. Okay.
Charlotte:Yeah. The digital affairs parts all fall within Telecom. Okay. It still carries the name Telecom, but it's not really telecom itself any longer.
Bart:And what are the core topics that you focus on?
Charlotte:At the moment, it's the omnibus, the digital omnibus. The AI omnibus is the main focus at the moment, and the data omnibus, which I try to coordinate within Flanders. And then again, try to get the Flemish position in at the Belgian level. And then on Monday, we go to the Council to actually defend the Belgian position.
Bart:And and what is an omnibus?
Charlotte:An omnibus is basically a document that changes a lot of other pieces of legislation. The name omnibus means that it carries a lot of legislation that it amends in one go; otherwise, you would have to change each one separately. So it's basically a big package that they created, and that's how you can see an omnibus.
Bart:The digital omnibus that you're working on now, what's in there? Is there something on the AI Act in there? Is there something on GDPR in there? Like, what is the...
Charlotte:So the main focus at the moment is the AI part, because, as you know, the entry into force of the AI Act rules for high-risk AI systems is 2 August 2026, and currently some standards are missing that still need to be published by the Commission. And as long as the standards are not there, it's basically impossible for a company to comply with the AI Act or for a government to enforce it.
Bart:Standards meaning, like, how companies should work in the context of the AI Act?
Charlotte:With high-risk AI, it's really standards on how the AI system itself needs to work. It's basically going to be a certification of the product, like the CE logo that you see on everything around you. That kind of standard will also be created for AI.
Bart:Okay.
Charlotte:The system needs to comply with that standard. But as long as the standards are not there, you cannot get the qualification, and you also cannot be compliant with the AI Act. And because the timeframe is that short, both for governments to implement it and for companies to comply, there is nothing we can do now. It needs to be postponed.
Charlotte:So basically we are trying to get that postponement, which is quite necessary to make it possible for companies and states to comply with the AI Act, because at the moment it's not possible. And because the deadline is approaching that fast, there is the incentive to complete negotiations within six months, which in European terms is really, really fast. Normally it takes at least one and a half years, and even then that's a very fast negotiation of a file.
Murilo:Maybe before, just to recap a bit, the AI Act, for people that are not as familiar with what it is. Could you do a quick rundown of what the AI Act is about and...
Charlotte:Yeah. The AI Act is basically a product norm regulation, so a product regulation on AI systems. Every product, like your car, has standards it needs to comply with in order to be considered safe for European roads. The AI Act does the same for AI systems. They've classified AI systems by risk level, measured against fundamental rights, so basically the impact the AI system can have on a person's fundamental rights, and then they classified them as unacceptable risk, high risk, or minimal risk.
Charlotte:So for instance, biometrics is unacceptable risk: classifying people based on biometric characteristics is an unacceptable risk. And then you go to high risk, which are, for instance, systems that can be used in settings like education, where there is a real risk of it having an effect on a person, or, for instance, HR. That's the main example that's given: if you have a job application and you do AI screening of the CVs, that's high risk because there can be some kind of bias in...
Bart:being done a lot. Right?
Murilo:That's being
Charlotte:done a lot. So those systems are gonna be all despite of high risk systems. So then they have to comply with a certain standard. How higher the risk, it can impede the fundamental rights of a person, the higher the compliance standard to get the CE certification or the product norm certification. So that's the idea behind it.
Bart:Okay. So what they're working on now is mainly that high-risk
Charlotte:Yeah.
Bart:So category to see, like, what should you do to work around Yeah. Work work with a high risk.
Murilo:And Yeah. Because unacceptable unacceptable risk, it's it's you cannot
Charlotte:do it. You cannot do it. Yeah. So for instance, screening if, like, cameras, just screening of persons all the time, facial recognition on street cameras and stuff like that is an acceptable risk, except for instance, if you work in defence or policing, there are some exceptions built in in those systems but mainly those systems cannot be used, Like scoring systems or, for instance, on persons like they do in China, that's not accepted in the European Union.
Bart:And what interests me, you were saying before, like, we now have roughly six months until we should adhere to these regulations.
Charlotte:Yeah. Yeah. Some parts have already entered into force.
Bart:Yeah. For the high risk.
Charlotte:For the high risk. And the high risk are the, yeah, the main risks, basically, and the most difficult standards.
Bart:But you're saying that six months is not enough, like, even a fast discussion takes a year and a half, basically, to get there. Yeah.
Charlotte:So in this case, what we are discussing is basically extending the period before it enters into force, in order to have those standards first, so that companies and the government itself are able to implement the necessary measures to have the high-risk system in place.
Bart:And what makes that it takes so long to define these standards?
Charlotte:Well, it's quite complex. We are trying to define something that's developing every single day. So the experts on it, the AI Office has experts there, they're developing it. It's quite difficult to do.
Charlotte:And on the other hand, you also have a lot of pressure from lobby groups. Some experts say the standards also get delayed because of lobby groups that don't really want those high standards to ever enter into force.
Bart:Okay.
Charlotte:So there is some aggressive lobbying also going on that makes the process go slower.
Bart:Yeah. And maybe to challenge that, because it's too short a period to come to a definition: is the current legislative system ready for something like this? Because AI now versus a year ago, the capabilities look completely different. Are you ever going to get to a definition that is relevant?
Charlotte:Well, that's the thing with legislation. Legislation is always slower than the world around it. So that's just how it is.
Bart:this is maybe like an order of magnitude different. Right? AI is moving very fast.
Charlotte:Yeah. But that's always the case. With every piece of legislation, you have the development of some innovation and afterwards it gets legislated. So it's a normal track. But indeed, in this case it goes so fast, the inventions come really, really fast, and the scope keeps shifting. A year ago, those seven-second videos were still really recognizable.
Charlotte:Today, yeah, there are fakes all over social media. So I think it's more important than ever that we also have these tools on transparency, that there are watermarks so you know it's AI generated. I think it's important that these safeguards are there, because it's really nice to see how fast it's developing, but it's also a bit scary if you look at it from a fundamental rights perspective, if you look at how people interact with media and social media and how they can be influenced by it. It's quite scary on the other end as well. Also with cyberattacks, it's becoming very prominent.
Charlotte:So yeah, it's becoming a scary world, so we need some kind of legislation to make sure that people are held accountable, and that they also know that if they have a product that's considered safe, it is indeed safe to use.
Murilo:Maybe before I touch on that: you mentioned you have experts working on this, and as you also mentioned, it moves very, very fast. Are there experts from industry that are consulted? How do people stay up to date? Because I think even for me, and I work in the field, it's very hard to stay really up to date with everything that is happening. How does the...
Charlotte:Yeah. So the AI Office itself is still in development. They're still hiring quite a lot of people at the moment, but they also have these expert groups, so they call upon experts on specific topics from all over the European Union to sit in those councils and help them develop those standards. So there's a lot of talking going on with experts, with people in the field, to develop these standards. But on the other hand, there's also lobbying going on.
Charlotte:So they are consulting a lot of people. They're trying to get the expertise, but sometimes the expertise can be lobbying, which is also hard to identify.
Murilo:Okay. And the other thing, when you were leading the discussion: it goes fast and it's scary, and we want to make sure that it's responsible, that the systems, the companies, whatever, are accountable. But it's also a difficult balance. Right? A colleague described governance the other day as kind of like traffic rules.
Murilo:Right? In some places you you want to go slower. So there's a lot of rules you have, like, the speed limit is low. And then some places you have way less and you can really move fast. Right?
Murilo:So it's a bit of a trade-off. Right? The more fine-grained it is, the slower things tend to be.
Charlotte:Yeah, and that's why we now have these omnibuses, because they try to get the balance right. There was a big change between the new Commission and the previous Commission: the previous one was more on the Green Deal, more social, and now we see the shift to innovation, to getting our economy stronger, and this is part of it. They want to make it easier for companies to innovate, and that's why these omnibuses are now in place. We see them not only on the AI Act; there are a lot of them, this is omnibus number seven.
Charlotte:There's been one on, sorry, there's been a lot about the cookie banners. Cookie banners, but that's also in the...
Bart:This digital one?
Charlotte:is also the digital one, but the first was on sustainable development more like the corporate sustainable development directive and those things to basically get the administrative burden for companies down. That's the main goal of the Omnibus.
Bart:And is that a reaction to like the changes in geopolitical world?
Charlotte:Definitely. More than ever we see the threat of Russia, but we also see a lot of competition. We do a lot of research on AI, but in Europe we are not that good at valorising that research. Going from research to actually starting an AI startup, starting a company, we are not that strong compared to, for instance, America or China, which subsidize AI development really heavily. In America, it's winner takes all, and the money that they throw at it is not really available in Europe, or we don't have that incentive or that culture to throw that much money into one investment.
Charlotte:What you also see in America is the winner-takes-all mentality, so they want to grow really, really fast. That's not the culture in Europe. We are more focused on research, on slower development. And to counteract that, so that we don't get into a situation where we are completely dependent on America or on China, we now want to incentivize companies to also become big, so that there are European alternatives to the big American companies like Microsoft or OpenAI.
Charlotte:And more broadly, that there are more cloud providers or service providers for digital platforms in Europe, so that we have a European alternative.
Murilo:Yeah. I'm curious how practically we can do this in terms of legislation and all these things, but before going there...
Bart:Well, maybe to start with the legislation, because I think we get the comments from across the Atlantic that Europe is doing too much legislation, that it's slowing everything down and holding back business. But I'm wondering, maybe a bit of a fundamental question: why do we even need something like the AI Act? Isn't deception already not allowed? Isn't using biometrics for bad purposes already not allowed?
Bart:All of the use cases you're talking about that are classified as high risk, you're probably already not allowed to do them. So why do we need yet another framework, just because the decision model is now not written in code but in model weights? Right?
Charlotte:Yeah. So the AI Act is not really about something being allowed or not. It's basically this: if you have a car that does not have the right components, it cannot be on European roads, because it's just not safe enough to use. If there's an accident, it could burn too fast, or there's not enough time to, for instance, get people out of the car. That ideology, that we need to make sure the products we use in Europe are safe for the people that use them, is basically the AI Act.
Charlotte:It's not "you cannot use AI". It's regulating AI as a product. So from that standpoint, it's making sure that the AI products we use in Europe...
Bart:Pretty much the solutions built
Charlotte:on top of these
Bart:Yeah, on top of these models. That's what...
Charlotte:Yeah. So you're free to invent, but it needs to comply with a certain safety standard. So the AI Act you may need to see as a sort of safety standard for AI as a product, not really regulating AI itself. It touches on that too, definitely, but it's a product regulation, the AI Act.
Bart:Okay.
Charlotte:And everything around you here has a product regulation: the chairs you're sitting in, the mic you're speaking into, TV screens, everything in Europe has product regulations. And the AI Act is just going to be one of them. It's going to be a new kind of product, but it is one of them.
Bart:But maybe what I'm trying to say: you have a lot of industries, let's look at the financial industry, take the example of a credit risk model. This was already highly regulated. Mhmm. Right? Now, for the same use case, you're just changing how the underlying model is built.
Bart:You have extra legislation. Yeah. But what you're saying basically is that maybe this risk-based legislation doesn't exist in all industries, and now there is, like, a...
Murilo:maybe facial recognition is a good example. Right?
Bart:More comprehensive one.
Murilo:I'm not sure if there was something for facial recognition before; maybe the systems weren't as good or as popular. Maybe it's like an umbrella. I mean, if I understand well, it covers a different scope: there's an overlap in scope, but that doesn't mean that the previous scopes covered everything that this one is covering.
Charlotte:Yeah. What they're also trying to change with the omnibus, for instance: previously the AI system itself was seen as a separate compliance system, and now they want to try to integrate it with other compliance systems. It was already a bit like that in the previous version, but now they make it stronger. So the Commission claims, I don't think it's literally written in there, but their goal is basically that, for instance, if you have a washing machine with an AI system on it, you run compliance for the washing machine and the AI system on it within one scope. Because the washing machine that you want to put on the European market already had some kind of product norm that it needs to comply with.
Bart:I see what you mean. Interesting.
Murilo:So you also mentioned before that everything we're seeing here has standards, European safety standards. You mentioned the TV, for example. So if the TV has AI, then there are two standards it has to adhere to.
Charlotte:Yeah. But now the commission wants to make it one compliance standard.
Bart:And the commission is the European commission.
Charlotte:The European commission.
Bart:It's at the European level.
Charlotte:Yeah. It's at the European level, but then it needs to be implemented in the Belgian system, and you need to have market authorities where you can say: here is my AI product, or my TV screen with an AI solution. Okay. And then you have to get the standard marking.
Murilo:And then, does it mean that it's changing the standard for TVs to include the AI component, or how does it actually work in practice?
Charlotte:To be honest, I'm not completely sure about it, and I'm also not sure the Commission is really sure about it at this point, but that's the goal they want to reach: that it becomes one compliance standard. If you look at the AI Act omnibus, for instance, you see that a lot of guidelines and instructions are still announced and still to follow. Those are going to clarify all of that, but at the moment they're not yet available.
Bart:And maybe on the topic of being more competitive, or at least at the same level as the rest of the world: the omnibus is focusing on easing legislation a bit, making it a bit easier for companies to move within this framework. The other point of view is: what is the strategy of the EU? They are looking at subsidies. How does that translate to Flanders? How does that work?
Charlotte:Yeah. That's also a hard question at the moment. There's going to be another act that's going to answer that question a bit, on sovereignty and digital sovereignty within Europe. It's been announced for 2026 and it's called the Cloud and AI Development Act. It's mainly going to dictate how we're going to approach that in Europe: which sovereignty ideology we're going to maintain, which cloud standards we adopt. Do we do a Europe-first approach?
Charlotte:Do we make sure companies like, for instance, Microsoft are still able to put a big data network or data centers within Europe? We're basically going to discuss what kind of cloud and data infrastructure we need in Europe and what's the minimum we need to foresee. And from that, there are probably going to be some subsidies and other projects. But within AI, we also have the AI factories, for which a lot of subsidies and strategies are being announced.
Bart:What are these AI factories?
Charlotte:So the AI factories are mainly big infrastructures to develop AI computing power. Countries can say, okay, we want to have an AI factory. They get together with a consortium, so it can be companies, universities, they all come together and they create a center for developing AI capacity, like computing capacity.
Charlotte:And there is a fifty-fifty approach. So you say we want to invest this much in it, and the Commission matches that investment.
Bart:And I've heard a lot about them in the last year, I want to say, maybe even a bit longer. How concrete is this already?
Charlotte:So for instance, in Belgium we're going to have an AI antenna, and I think it's going to be linked to, not 100% sure, I think the German one. That basically means that we're going to have an access point to an AI factory in another country, where they have the real computing power, the data centers, to develop things. In Belgium, we're going to have some kind of service points where companies and universities can ask for access to the factory in the other country and then get computing time in order to develop their AI tool.
Murilo:Just to make sure I follow: an AI factory is like a place to develop chips or all these things, stuff for...
Charlotte:can be everything, it's quite wide.
Murilo:Anything to support AI development.
Charlotte:Yeah. AI development.
Murilo:And the AI antenna is like a link between an actual AI factory, which could be in another country, and, I don't know, Leuven. Right? So it's like we don't have something here, but we have a direct link to an actual AI factory in Germany, France... Yes. Okay.
Charlotte:And so that's how it's supposed to work. Basically you can buy computing time, and we can subsidize it from Belgium, so we can say, okay, we're going to allow so many companies and so many universities to use that access point. And then there are probably going to be some financial transactions from Belgium to the AI factory that actually provides the infrastructure for that high-performance computing.
Murilo:Okay. You talked about incentives, and that's the question I was trying to remember earlier. You also mentioned the US, where there's a mentality of winner takes all. With AI in particular, I have the feeling, I'm very convinced, that one plus one is equal to five or six. Right?
Murilo:You cannot just split things evenly and put the parts together; the development is not the same. Right? I think today, if you want to build state-of-the-art models, one player really needs to have a lot of funding. Right? So I believe that if you have a certain budget and you split it into 100, what the 100 pieces can accomplish separately is going to be way less than if you had put all the budget into one.
Murilo:Right? Which I think also goes a bit against the European cultural way of working. Right? Is this something that is also discussed for subsidies or incentives?
Charlotte:At the moment, not that I am aware of.
Bart:Mhmm.
Charlotte:It's good that you tell me that, because that's also something I can take with me to my work as an argument for how we need to look at these subsidies. Yeah. From Flanders, we have a subsidy network that you can apply to to get funding for AI projects, but at the European level it's more that they make calls via the Horizon or Digital Europe programmes, and then you can, as a company or a consortium working together with other companies, apply for that call, and depending on how well your application scores, you can get the funding. So it's more: we have this project in mind that we want to do in Europe, then they open the call, and then you can apply to it as an individual company. So it's less about looking at which company we are going to support.
Charlotte:That's not really how it's being done in Europe at the moment. It's more like: okay, this is a need we have within Europe, we need more data on X, Y, Z to get more data that works for AI. Then they place that call, and then you can apply. If your company is working on it, you can apply to get funding in order to deliver those results for the European Union. So it works like that.
Bart:We've discussed this before a bit, and I tend to agree with you. I'm also a little bit cynical maybe about the whole subsidy structure. Maybe it's a bit of a hot take, but I think we're a bit too democratic when it comes to giving out subsidies, in the sense that everything needs to be very fair and equitable. And if that region gets that, the other region also needs to get something.
Bart:I think the reality is that governments, at whatever level, whether it's regional, national, or EU, a government is not good at capital allocation.
Charlotte:Really not. No. In Europe, we are 27 countries working together to get something done. There is this idea of fairness going on.
Charlotte:So everybody needs to be a winner. Like, for instance, with the AI factories call and the AI antenna call, every country got something. If you look at the results, every country has at least an antenna, so everyone has something. But that's also a bit the European...
Bart:But that's super difficult. And I think the AI factories are a good example. Then you have something that is there, but it's like: we want to go to war when it comes to AI, and you give everybody a gun, which is a data center. But, I mean, you don't have bullets. Even if you do it with data centers spread across all countries, you're still not doing anything.
Bart:Like you just have infrastructure at that point.
Charlotte:Yeah. In the Omnibus, there's a call from the commission to work together. That's already good. That's already good.
Bart:Maybe also there, the example, if you talk about cloud, like strong cloud providers, we don't really have them in Europe. You have OVH, you have Scaleway, and that's maybe more or less it. And I think we've heard this over the last ten years even. There was even some project, I recall, around cloud infrastructure from a while back.
Bart:But it's also a bit the same. Like, we're going to invest all over the place, and we hope that something happens. But take all those investments, a few billion euros, and just put them together in a for-profit company that knows how to allocate that capital. It's super difficult to do something like that in Europe. Right?
Charlotte:Yeah. Politically, that's almost impossible.
Bart:That's a pity.
Charlotte:Yeah, I get it. Yeah. But that's also not, yeah.
Murilo:I think
Charlotte:we are. It's the culture thing, I think. Yeah. It's how Europe is built. It's countries working together; they are still countries, and their incentive is that the people at the head of those countries are elected by their own people.
Charlotte:So there's the incentive to really focus on their own country and make sure there's enough money coming to them.
Bart:could we don't do this at Belgian level at the Flemish level even?
Charlotte:Even there, because of the division of competences in Belgium, that's also quite hard to do, because you have Flanders, which is responsible for innovation, economy, development, company support systems. Then you have the federal level, where there's also BELSPO, so they also have some kind of innovation level there, and then you have Wallonia with their own programme. But also, like in the last elections, you have seen that the mentality of Wallonia and the investments they find important are different from what Flanders, for instance, wants to invest in. And the investment capacity at the moment is also quite different. Because in Belgium there is not one deciding power; every government is at the same level, which is very hard to explain to foreigners.
Charlotte:Or if you work together with people from France, for instance, they don't comprehend it at all that the federal level cannot say, okay, this is what we're gonna do, but that's just not possible in Belgium. So if we don't have an agreement, we cannot say anything. And yeah, that's how the system works. It works and it doesn't work at the same time. I think that's the best description of Belgium.
Murilo:Yeah. I think we're saying this also because in the US, you do see, I mean, there are few players, but they're the ones that really drive everything forward. Right? And I think if they didn't have that concentration of funds, they wouldn't get where they are, for sure. Right?
Murilo:And like you said, there are good things and there are bad things. But on this one, I think it's very difficult to reconcile the European values and still... For example, you mentioned everyone needs to be a winner. And maybe at the EU level everyone is a winner, but on the global scale, everyone's a loser.
Charlotte:On the global scale, you have the same system going on. Like at the European level, also in America or in the United Nations: if you do international negotiations, you cannot do the winner-takes-all approach. Everybody needs to defend the decision they have made at the international level; they need to advocate for it at home. They need to be able to say this is a positive thing, and that needs to be possible too.
Charlotte:The US at the moment is a bit different, a bit more extreme than in other years, but that's mainly the incentive. Also Trump: if he signs something, if he makes an agreement, like the Russia deal, he needs to be able to sell it at home, and that's the case at every international level where you negotiate. So "everybody needs a win" is also there in international negotiation. And within America you have these funds and that real risk appetite in investments, but I think that's about it.
Charlotte:If you go further than that, to the international level of negotiation, then you are again at the level where everybody needs to win.
Murilo:Yeah, but maybe that's just about the funds. Right? Maybe it's just in the allocation of funds that that approach doesn't work as well. Right?
Charlotte:But in America, I think it's more of a private approach and less of a governmental approach. Also in America, you have states that work together. Again, they also need to have those wins.
Murilo:And how does it work in China? Because I feel like, I mean
Charlotte:China is more of: the government dictates and they can appoint, yeah. In essence, it's still a communist country. Mhmm. So the government just says what they're going to invest in, and that's the economy. And they can, as a government, say: we're going to invest in this particular company.
Charlotte:In Europe that could be possible, but then we need to give the argumentation for why it's the best company, why there's no other option. If you give a subsidy to a company in Europe, or if they win a contract to do something for the government, it needs to be as fair as possible. You need to have at least so many applicants, you need to have standards, you need to show that they are the best for the budget, that they can do it. So there are all these criteria to make it fair.
Murilo:Does this happen? Like in recent years, the last ten, twenty years, has it actually happened that there was a fair process where everyone could apply and all these things, but the funds were still concentrated into one?
Bart:Are you talking about Europe now? In Europe, yeah.
Charlotte:I'm not sure, could be, but there are really big projects. For instance, I think the Einstein Telescope is maybe one where a lot of budget is being allocated to the process. But the moment we actually win the Einstein Telescope and we have those budgets available, there's again going to be a process: for each part of building it, there's going to be some procedure to make sure we have the fairest competition.
Bart:And maybe something like that is also a little bit easier because it's very much in the research domain. Like, it's universities that need to work together.
Charlotte:But if you actually want to dig the hole, you need a contractor.
Bart:Yeah. Yeah. Okay. Yeah. Yeah.
Bart:It's a whole public procurement process.
Charlotte:Yeah. Yeah. And that's the whole idea, that it's fair. We cannot just say, like China does: okay, we're going to invest one or two billion in this company because we think this company is going to make it. No, it needs to be a procedure that's fair, and it's also fair to the companies, because they can all apply for it.
Bart:Maybe moving away from how we become competitive. There's also this other movement going on, which is maybe not as painful yet, but we see a lot of indicators evolving in a way that there will be job displacement. Juniors that are already in it, especially in the tech sector, have a much harder time finding a job for a variety of reasons, but AI definitely being one of them. Is this something that is talked about at the European or Belgian or Flemish level?
Charlotte:Yeah. Definitely. If you look at the recent plans, for instance the Apply AI Strategy, there is a whole section on skills and AI and how they're going to address it. And there's also the European skills policy, which really wants to make sure there's enough room for people to reorient towards jobs with more of a market. But indeed, in the IT sector it's quite difficult at the moment because of AI specifically. They are thinking about it, concrete plans and funding are going to be made available over the coming years, but the concern is indeed there and it is being discussed.
Bart:Yeah. Yeah. Okay. And do do they already have some concrete actions?
Charlotte:To be honest, I don't think so at the moment. I'm not sure at the moment. The Apply AI Strategy is different from the legislation: for instance, the European Business Wallet and the omnibus are legislation, so they go through the legislative process within the European Union, so we really talk about them.
Charlotte:This is basically being published by the commission.
Bart:Okay.
Charlotte:There's not really a discussion about it. It's just a strategy from the Commission, which they carry out with the budget that's been allocated to them.
Bart:So that means it's still very early days. Is that what that means?
Charlotte:What it means is that the Commission sees it as their competence, so they can publish this. And we're now going to see, within calls, within funding opportunities, more concrete measures coming out of this Apply AI Strategy. There are some calls that have already been announced in the annex, so there's already a list available. But only when those calls are published and brought into the world, basically, are we going to see more of their effect.
Charlotte:At the moment, we had some talks about it, but there's not really a discussion on the Apply AI Strategy. It's more a plan of the Commission on how to address it. So at the moment I cannot really say a lot about that; I don't have more insight than what the text itself says, because the text is just what it is at the moment. Yeah.
Murilo:Yeah. Well, we'll put the links in the show notes as well if people want to have a look. The AI Act, is it, like, the first in the world, kind of, to...
Charlotte:Yes. It's the first in the world to try to regulate it, and also to try to regulate something quite innovative. But for instance, in California, a few weeks ago, something similar to the AI Act was published. There's a bill in California that has some elements of the AI Act. So that sense of "we need to regulate this new innovation, this new industry" is also there in America.
Charlotte:And you see it more and more coming at the United Nations level too. There are more talks about sustainable AI and how to make AI more human-centric. So all these talks are happening at all the international levels right now. There are also, I think, United Nations standards at the moment, but because the AI Act standards are higher than the United Nations standards, we don't really talk a lot about them in Europe. But they do exist. The difficulty, yeah...
Charlotte:The United Nations is not really binding, whereas the AI Act is really an act. With the United Nations, if it doesn't come from the Security Council, it's more of a suggestion; it's not legally binding.
Bart:Okay. And you mentioned the, quote unquote, act that was passed in California. Yeah. Is it called an act?
Charlotte:It's a bill in
Bart:How comparable is it? Does it have the same goals? Is it more relaxed? Is it more strict? How does it compare?
Charlotte:It's it's less strict, but the essence is the same. Okay. It's just already something.
Bart:Does it also follow this concept of low risk up to unacceptable risk?
Charlotte:I'm not 100% sure. I just know that they are also discussing it now, and it was, like, a few weeks ago. It was mentioned in the AI Board as, like, see, we have been... yeah.
Murilo:The one?
Charlotte:I think so. Yeah. Yeah. It could be that from September 25. Yeah.
Charlotte:It could be that one.
Bart:Yeah. Yeah. Very interesting. Interesting.
Charlotte:So there are these regulation approaches all over the world at the moment. It's not only Europe doing it.
Murilo:When you're the first one to do something, maybe it's normal that there are some things you need to amend later. Right? And that's maybe what you're saying with the omnibus. Is it concrete enough that you can share a bit of the things that they are relaxing?
Murilo:Or
Charlotte:Yeah. I think the most controversial thing they are relaxing is that you can use personal data for bias detection. So in AI training, you can start using GDPR-protected information, personal data that you normally were not allowed to use, which went quite against it, but now they are...
Bart:PII data, mean?
Charlotte:Yeah. GDPR. Yeah.
Murilo:First of all, that's...
Charlotte:Yeah. Okay. So that is quite a relaxation of the legislation. There are strong structural rules: the information needs to be stored locally, and it needs to be deleted afterwards. So there are some counterparts to it, but you can basically use personal data in AI training models purely for bias detection, which was not allowed previously, and that's, I think, the most controversial part of the AI omnibus at the moment. And then also, in the other omnibus, the definition of personal data under GDPR is being made more lenient. Previously, personal data was everything that can identify a person, but now it's becoming more subjective: if you have a list of data and your company itself is not able to detect who is who based on that data, it's not considered personal data. For instance, if you have IP addresses, most companies are not able to detect who is behind those IP addresses. Some are, but previously that was still considered personal data. Now that list of IP addresses you have is no longer considered personal data. And that's also where the most controversial parts of the omnibus are.
Charlotte:And I think some people could maybe not...
Bart:And less cookies.
Charlotte:And less cookies. Yeah. That's also a big one.
Bart:That is true.
Charlotte:Yeah. So instead of having to click on every site, it's going to be, again, it's not really explained technically yet, but there's going to be something in the browser where you can say, for instance, "I will always accept all cookies", and you press that once and then you never have to press the cookie banners again. Some also say that's quite a risk, that you can end up in more of an accept-or-pay model: if you do not say "I accept all cookies at all times", you have to pay to enter certain websites. And that's a bit of a risk there.
Murilo:Earlier in the conversation, you were talking about, I think, what motivated the omnibus as well, and you mentioned Russia, the threat of Russia, when it comes to AI. I didn't want to interrupt you, but I was a bit surprised, because I was expecting you to mention the US, right, with the big AI players. Could you just double-click a bit there? When you say Russia in the AI game, I don't...
Charlotte:It's more on the cybersecurity side. They're using a lot of AI and AI tools to get better at launching cyberattacks on governments, on people. And what we see in the cybersecurity landscape is that there is actually an incentive from countries outside of Europe, like Russia, to go more for the Western countries, so those attacks land more in the Western countries. So it's geopolitical: since the Russia discussion, basically, we see that cyberattacks generally are more concentrated on the West than on the Russian side.
Bart:And a lot of misinformation Yeah. Being spread by Yeah. Russian parties. Yeah. Not only Russian parties.
Bart:There was, I want to say, two weeks ago, that X suddenly started showing the location of where users were located.
Murilo:Oh, really?
Bart:Without asking anybody. That's another discussion, that that is suddenly possible. But what it showed is that in the US, a lot of, like, big MAGA influencers were not even located in the US.
Murilo:Well, yeah, I remember there was a documentary a while ago that was also talking about the influence of Russian, yeah, bots, trolls, whatever.
Bart:Know? Probably also from other locations.
Charlotte:Also, fun fact, X recently got fined by the Commission. Yeah. And their response was kicking the Commission accounts off X. Seriously? Yeah.
Charlotte:Oh, I didn't know. I think they threw the EU Commission accounts off X.
Bart:And they got fined for, I want to say, 120 million? Yes. Everybody in the US government got very angry.
Murilo:Yeah. Oh, wow. I
Bart:hope that the commission stands aground. Yeah.
Murilo:Fines are 70 million under the Digital
Charlotte:Service act. Yeah. So the digital service act is actually an act that says that companies are like platforms that are available in Europe need to comply with certain centres of safety. So for instance, they may need to do content iteration to make sure that there's safety use for children, for instance, that's regulated there, that they need to make sure. It's not defining how exactly they need to make sure, but they need to get some kind of standards and show that they try to get some kind of standard of safety on their platforms.
Bart:It's a good thing.
Charlotte:Not doing it. Interesting. And TikTok was also on risk of getting a fine, but they made some leniency, so they
Bart:got some commitments. They actually didn't do that. Yeah. Very very last minute.
Charlotte:Very last minute. Very last minute, but they indeed they indeed did not get a fine. But they still need to make some changes to the platform. But the commission got enough guarantees that they did not get a fine.
Murilo:Yeah. And here the BBC says that X suspends the European Commission from making ads after the fine. Yeah. It feels a bit immature.
Charlotte:Yeah. Well, yeah. What do you expect?
Murilo:Yeah. It's okay. Interesting. This is three days ago. So it's very, very recent.
Charlotte:Very recent.
Bart:Maybe moving a bit further, from another point of view, from the startup scene. If someone wants to start building a company that relies heavily on AI, or even one focused on building core technology or infrastructure for AI, what can they look to the government for in terms of support? Does the government support this in a way?
Charlotte:There are some startup projects from Flanders itself. VLAIO is basically the best entry point to look at. They have some startup...
Bart:Yeah. Maybe just for the listener, what is VLAIO?
Charlotte:VLAIO is an agency within the Flemish government, and they support companies within Flanders on all different kinds of things. It can be cybersecurity, it can be an innovative project they want to do. They can apply for subsidies within those different topics. There are quite a lot.
Charlotte:And then they can get that support financially, or they can even get, like, tested or audited through VLAIO.
Bart:And and what kind of subsidies? Like, what amounts are we are we talking about?
Charlotte:I'm not an expert on VLAIO, so I cannot really say. But I know that they try to do more than just give the subsidy, especially for startups. So if you are a startup, you can apply to them and actually get not only financial support but also guidance. You get some structure on where you can go. There are some experts there that can actually help you develop the company, advice-wise and finance-wise.
Charlotte:But if you go to their website, for every topic there is another subsidy, and they have different experts on those topics. If you contact them, they can help you. Yeah. And they're quite open to it, and they are specialized in giving exactly that support to companies.
Bart:And then maybe in the longer run, when they're actually there, these AI antennas or AI factories are also things that new companies can leverage?
Charlotte:The idea is that they can use it, and for Belgium that's probably going to be through the AI antenna. How are we going to do it? Because of the competence division in Belgium, it's probably going to work a bit like the NCPs. I think it's going to be a bit like that. So that basically means that there is an advice centre for Wallonia, Flanders, Brussels, and that they can guide you towards the right funding or the right way to get that computing time.
Charlotte:Which funds are there? There is already a Flemish supercomputer where you, also as a company, can apply for computing time, and they try to keep the rates as democratic as possible. But there's already a Flemish supercomputer.
Murilo:I didn't know that. VLAIO, when was VLAIO founded, or when did it start? Because I know it's part of the Flanders artificial intelligence policy plan.
Charlotte:Yeah. But VLAIO existed a long
Murilo:way before. So it's just like they added to, so maybe
Bart:so the AI and Vlio doesn't have anything to do with AI.
Charlotte:Right? I
Murilo:have heard about Vlio in my my day job. Right. And I think it was always related to AI. So I've come to associate Vallejo with AI, but that's also why I was Yeah.
Charlotte:In Flanders, we have the AI policy plan. There are three pillars in the plan. There's one that's really research-wise, so they try to do as much research as possible; it's more for universities and PhD students, where we give subsidies in order to get those studies done. Then there is an implementation track, so basically getting that research into companies. Through VLAIO they can apply for, they have like TETRA and COOCK projects that they can apply to, and then they can get the knowledge from a particular research project directly into their company, and if they apply for certain tracks, they can also get subsidies from VLAIO in order to be able to follow the track. And then there's a third pillar, and it's more like general knowledge for the population and so on. So that's the Kenniscentrum Data & Maatschappij, VAIA, and amai!
Charlotte:Those are centres that are more directed at general knowledge on AI.
Murilo:Like AI literacy. AI
Charlotte:literacy, but more towards citizens and companies. So the Kenniscentrum is more focused on companies; if they need advice on how to comply with AI, for instance, it's more the Kenniscentrum. VAIA is more general information, and amai! is really citizen-focused.
Bart:So there are a lot of possibilities. And if someone wants to look this up, best to go to the website of VLAIO, v-l-a-i-o.
Charlotte:Yeah. For the policy plan, there is a different website. Yeah. Yeah.
Charlotte:And, yeah, just for information, there's an exact same plan for cybersecurity.
Bart:Okay.
Charlotte:Especially for cybersecurity.
Murilo:Maybe also: this plan was, I don't know, drafted... launched... It was launched in 2019.
Charlotte:Mhmm.
Murilo:So it predates the GenAI hype. It was reviewed, and it was renewed in 2024. Big changes, small changes? Like, did the release of ChatGPT and all this attention influence it?
Charlotte:For the first two pillars, I think minimally. I'm not an expert on this plan in particular. Within the policy, the part about how the knowledge needs to get to companies and citizens, the more general support, that pillar we are looking to change a bit. The needs of companies have changed since the programme was first launched. Back then, just explaining what AI is was enough.
Charlotte:Now everybody has heard of AI, so the needs of the population and of companies have completely changed, since AI is developing that fast and since the AI Act. So the needs within the supporting pillar have changed. Within the research part and within the implementation track, the essence is still quite the same: getting that research done and getting the research to companies. But in the third pillar, I think we're going to see some changes.
Bart:we go to the last question. As like, if looking forward for the coming, let's say, two years from a policy or legislation point of view, like, what what do you hope that we get done as Belgium?
Charlotte:As Belgium? Well, yeah, I find that a difficult question because I'm not 100% sure myself. In an ideal world... Flanders is already a really innovative region. Yeah. And I think we can develop the industry more.
Charlotte:I think that's what I hope. And also, within Belgium, that we indeed get those bigger projects done. Take the AI gigafactories, for instance. I'm not saying that as Belgium we need to participate, but I think it would be good if we as Belgium work together to get at least a part of the gigafactories, because that's where the big developments are going to be, in my opinion. In those really big research centers where there's a lot of infrastructure for data and research, I think there are a lot of opportunities. Yeah.
Charlotte:You see it in Geneva, not with the Einstein Telescope but with CERN, for instance: the companies that are located close to CERN, you see that they get a big win from it. And I think if we make sure that the necessary infrastructure is also here, then we also get a big win out of having infrastructure available for our companies and for our people.
Bart:That's indeed a nice goal, to see that from the investments that Europe and Belgium are going to make, we can basically reap the benefits. Right? Like, if we have data centers, if we have compute, that this will spur innovations that we would normally not see.
Charlotte:Yeah. I think that's the goal: that we have enough of the infrastructure itself in Flanders and in Belgium that we can compete, not really compete, but get the benefits from it and attract these innovative companies. And also, because we have a strong SME economy, we also have a strong startup economy. There are a lot of startups in Flanders and in Belgium as a whole, so they need to have the infrastructure available close by. And I think...
Bart:And what we will see is not necessarily that we will start competing with the US or China on AI models or on chips, but maybe on specific application domains. Like, we're in Leuven today; there's a big hub on biotech, there's a big hub on tech. Maybe we'll compete more on those application domains.
Charlotte:Yeah. I think, on the research side, we are really strong in our research departments. We have imec here, which is also a very big part of the AI industry. On cybersecurity, we also have really strong research here in Belgium. We rank really high on security by design, for instance.
Charlotte:But the valorization, and getting companies within Belgium itself to know that we are also experts in a lot of topics, is sometimes difficult. So we need to get the infrastructure there, and we need to bring the research domain and the economic partners together in an ecosystem, even more than we are trying to do with those policy plans, to get more valorization and more economic development within Belgium and Flanders.
Murilo:Alright. Cool. Thanks a lot.
Charlotte:You're welcome.
Murilo:It was very interesting. I learned a lot of stuff. If people want to reach you, how can they...
Charlotte:Yeah. It's quite simple. My email address is [email protected]. So that's Flanders; I'm with Flanders.
Murilo:Yeah. Cool. Very curious how everything is going to go. I think maybe we can do an update from time to time, if you're up for it, if you had a good experience. Right?
Murilo:Like, maybe
Charlotte:Maybe after the omnibuses are done.
Murilo:Yeah. Maybe after all that. But I think sometimes Bart and I discuss things we see in the news, and I always feel like we discuss them without having a very informed opinion, you know, because we're not in it day in, day out. And I think it's really interesting to have your perspective on this, not just on Flanders, but also on Belgium, the EU, the world and all these things. So I really, really appreciate it.
Murilo:Thanks a lot. And, any any last words, Bart?
Bart:No. Thanks a lot. It was very interesting.
Charlotte:Nice to be here.
Bart:So Alright. Thanks, everyone. Thanks, everyone, for listening.
Murilo:Yes. Yes. Yes. And see you all next time. Ciao.
Bart:Ciao.