Delska delivers high-performance cloud, data center, and network solutions with a personal touch—combining Baltic-level reliability with truly responsive, custom support. From ultra-secure Tier III facilities to rapid hardware rental and managed services, they power businesses that need speed, safety, and scalability on demand.
In this episode, we explore the transformative impact of AI on data infrastructure with Rihards Kaletovs, CTO of Delska. We discuss the significance of data sovereignty, sustainability, and the advantages of regional data centers. Delska’s new 10-megawatt data center in Latvia is highlighted for its green technologies and efficient cooling systems. The conversation also touches on the future of data centers, AI’s role in enhancing efficiency, and the importance of compliance and certifications in data security.
Watch the podcast here:
Listen to audio only here:


Learn more about Delska.
Interested in appearing on the SourceForge Podcast? Contact us here.
Show Notes
Takeaways
- AI is revolutionizing data infrastructure and operations.
- Data sovereignty is essential for compliance and security.
- Delska is expanding its data center capabilities in the Baltic region.
- Sustainability is a priority in modern data center design.
- Local data centers can offer better latency and compliance.
- Certifications enhance transparency and trust in data management.
- Energy shortages are prompting a shift to regional data solutions.
- Innovative cooling technologies are crucial for efficiency.
- Water usage in data centers is a significant concern.
- The Baltic Highway improves connectivity for enterprises.
Chapters
00:00 – The AI Revolution and Data Infrastructure
01:13 – Delska’s Evolution and Rebranding
04:04 – Preparing for AI Demands in Data Centers
06:45 – Shifting Focus: From Cloud to AI Performance
09:29 – Regional Providers vs. Hyperscalers
14:11 – Impact of GDPR and Certifications
18:51 – Data Sovereignty in the Age of AI
21:57 – Energy Shortages and Regional Data Centers
25:04 – Innovations in Delska’s New Data Center
27:16 – Sustainability in Data Center Operations
28:16 – Building Sustainable Data Centers
31:31 – The Ethics and Regulations of Sustainability
36:13 – Water Usage and Efficiency in Data Centers
40:39 – The Baltic Highway: Enhancing Connectivity
47:07 – Future Innovations in Data Center Technology
Transcript
Beau Hamilton (00:01.176)
Hello everyone and welcome back to the SourceForge Podcast. I’m your host, Beau Hamilton. Now listeners, as you know, we are in the midst of an AI revolution that some refer to as the new electricity. It has the potential to be as revolutionary, if not more so, than the internet and will have very profound effects on our society. Most of the attention has been, I would say, focused on the consumer-facing applications, namely the chatbots like Gemini and ChatGPT.
I would say a little bit less on the infrastructure and the data required to make all these AI advancements possible. Now today’s episode will focus on the latter. It will kind of weigh in on the bigger industry topics of data sovereignty and sustainability. I’m joined by Rihards Kaletovs, Chief Technical Officer at Delska, a company that has been quietly building one of the most advanced regional data infrastructure networks in Europe.
And it’s pretty exciting. They’ve actually just unveiled a brand new 10 megawatt data center in Latvia, which is part of a growing ecosystem designed to give organizations full control over where their data lives, how it’s powered and how it scales. So we have a lot to discuss. So I just want to bring in Rihards and get right into it. Rihards, welcome to the podcast. Glad you could join us.
Rihards Kaletovs (Delska) (01:13.842)
Thank you. Good evening or good morning. It depends where we both are. Nice to meet you.
Beau Hamilton (01:18.688)
Right. I believe it’s my morning and your evening. The power of being connected over the internet. I love it. Now, as we already established, your company is one of the biggest data center operators in the Baltic region. And you have been in operation for the past almost three decades, 25 years or so. Now, I understand you also recently rebranded, bringing together two major data center companies under one sort of unified name, which I think is pretty exciting. What was the driver behind that decision, and how does the new Delska identity reflect where the company is headed?
Rihards Kaletovs (Delska) (01:56.444)
Yeah, you are completely right. We have already been in business for 25 years, or actually even more, in both regions. Now we have expanded in the Baltic region from Latvia to Lithuania, and we are looking a bit further to join other companies to our group. What was driving it? If we look at the market for the last five or maybe even more years, more customers are looking to expand their business, not only in the local region, but also going to Europe, Central Europe, Western Europe, and so on. And quite an important part here is played by the communications.
We need to establish secure and stable communications. That’s why we joined the two companies. So we expanded our portfolio in network services. We brought up a very serious part of the network infrastructure: a DWDM network, which is usually a very important part for operators or big data centers to scale out the infrastructure. And also we brought up the data centers in multiple locations, so we can work on disaster recovery solutions for the customers and strengthen our portfolio from this point of view.
Beau Hamilton (03:28.246)
So you have kind of combined the powers of each company into one unified entity. I’ll also just say, from a marketing standpoint, I think the new name is much better. It rolls off the tongue easier, and I think that is important, for what it’s worth. So you have the unified name that helps, I would say, better position your company for the future. What would you say Delska is doing to prepare its data centers to handle the high-performance, compute-intensive demands of AI and all these next-generation workloads?
Rihards Kaletovs (Delska) (04:06.856)
I would start a little bit further back on this topic, as this AI thing has been going on intensely for, I would say, the last five years. Of course, it has a bit longer roots, but still. Quite many operators, data center operators, have been struggling for the last years to catch up on AI requirements. What we all were trying to do is retrofit old infrastructures, trying to get more electricity and work on the cooling solutions. However, it is not enough, as we see today. We also see the big hyperscalers and other big players building new infrastructure, because AI and HPC computing is actually a different beast. They require a completely different scale and it’s impossible to fit them in the old data centers. We see it in Europe during the summer seasons especially, when many data centers overheat.
So we decided to build a completely new modern data center which will be able to deal with AI workloads. We brought in the necessary electricity. The data center was designed as a modular one. We have now brought up the first module, 10 megawatts in size, and we are ready to scale up to 30 megawatts. We have already designed and prepared the electricity for that. And also we are working very closely with big OEMs and companies like Nvidia to understand the architecture of upcoming AI progress in the next few years. So the data center is able to provide liquid cooling for most of the AI hardware and platforms.
Beau Hamilton (06:14.744)
Really interesting. So I want to get into the energy shortages conversation in a bit. But you’re mentioning the scalability and the modular components that you’ve rolled out recently with your data centers. Can you maybe elaborate on your current approach to your data centers in regards to how you handle performance, and how that has differed from maybe the approach you had a decade ago, before we had some of these AI needs and requirements?
Rihards Kaletovs (Delska) (06:48.552)
I would say the first driver for all the data centers, and I think even for the customers, is efficiency, because it directly affects the cost of the platform, of the solutions, and so on. For many years we have been working on the data centers, always trying to optimize and squeeze every bit of electricity we can and cool as efficiently as possible. Now everything, as I mentioned, has shifted. Now we have to go to liquid cooling, because the kilowatts that are produced by AI, actually it’s not even kilowatts, it’s megawatts on an even bigger scale. And we had to rework this design. We worked with dedicated design companies on this part and optimized the infrastructure specifically to handle these AI loads. But at the same time, we are also able to provide the colocation space for, let’s say, normal customers, because business doesn’t consist only of the AI side of things. We also receive emails, we work with files, and we are able to handle those workloads as well.
Beau Hamilton (08:11.566)
Right, that’s the thing, right? Before the AI boom, it was mostly concerned, I would say, with the cloud infrastructure, the needs of cloud-based storage, handling emails, like you mentioned, sending and retrieving files. So it’s definitely pivoted to more compute-intensive, performance-intensive tasks.
But I would say, yeah, as the saying goes, that’s where the rubber meets the road, so to speak. AI is cool, it’s the fancy term everyone’s talking about, but it’s sort of useless without the back end, the infrastructure, the horsepower behind it.
So I want to focus a little bit on the cloud. So when people think about the cloud, they think about maybe some of the tangible aspects of being able to retrieve data remotely with just an internet connection. They think about maybe some of the global tech players that are making that possible, right? You think of Amazon and AWS, Microsoft and Azure, Google Cloud. The fancy term for these players is global hyperscalers, because they provide all the assets, all the things required to make those services possible. You’ve got the computing, storage, networking, all that stuff on a really large, massive scale.
Now, I’m curious to hear your thoughts on why organizations should consider alternatives like regional providers instead. Are there advantages that come with staying closer to home and having that data stored locally as opposed to maybe a data center in Texas, for example? What are sort of the pros and cons there?
Rihards Kaletovs (Delska) (10:04.838)
I guess there are a few aspects to how we can look at it. Of course, there is a good place for hyperscalers and the big clouds. It depends on the workloads and the companies. If you are a very big enterprise, some of the workload should be shifted out to the big hyperscalers and so on. But usually the big companies keep some data locally, because the other aspect is compliance. Quite often in a region there are rules, especially if you talk about the European Union, there are quite strict rules on many types of data, I don’t know, like medical data, government data, and other personal types of data. It should be localized, and it should be carefully monitored what is brought to the cloud and what is still under control and local in the particular country.
On the other side, the big players have very strong platforms, they have strong policies, and they have to take such an approach, because managing such big infrastructure requires very precise operations. However, when you come to a little bit smaller companies, startups or small and medium businesses, they usually need some agile approach, some flexible cooperation with the operator, because they need some exceptions for their services or hardware and so on. And that is a very strong point when you can work with smaller providers.
They are eager to find this cooperation point and work with customers. That’s really our experience, how we work with the customers. We like to use this word, this personal touch, in the IT world. So it’s nice to call and know there is a person who knows exactly your solution, not just a call center where you wait for many hours for an answer.
Beau Hamilton (12:24.878)
Right. Yeah. Having that support is huge. I feel like that does tend to get lost when you have a bigger organization, unfortunately. And then, you know, another thing that comes to mind is just maybe the latency issues. I know the big tech platforms might have, I don’t know, more money to throw at this problem. But I think having a closer proximity between a company, a startup, and the data center improves the latency of being able to send and receive files and information.
But what you’re saying first and foremost, which was the compliance issues, I think that’s what really stands out for me, right? And then I think along with that, you have sort of maybe the bigger picture of the political uncertainty, right? So Europe has GDPR, which sets rules for processing data, which other countries like the US don’t really offer. We have state-led data protection laws, and GDPR, I think, has pushed many companies here in the States to implement stronger privacy and security regulations, because the internet crosses boundaries and state lines and whatnot. But there’s still no comprehensive federal data protection law here in the US for the modern age, which is a little bit frustrating. But I think the political uncertainty with what’s happening in the States is pushing Europe and other allies to look into more domestic solutions.
I also want to get into, so the data sovereignty aspect is interesting because of all these compliance standards in Europe. And I think you also have the role of certifications, which are crucial to improving data center operations. My first question before we get into some of the data sovereignty aspect is, what impact has Europe’s GDPR and other regulations had on your business? And then how would you say certifications fit into that equation?
Rihards Kaletovs (Delska) (14:24.808)
The GDPR was, I would say, not affecting our business too much directly as a company, at least from an internal point of view. Yes, there were some cases with the customers. We had to also negotiate or rethink how we work with the data, especially if we provide the management service for the platforms. But that is quite a small aspect of what touched us. For other customers, yes, it was a much bigger challenge to understand what data they have. It’s not so much where it is located from a GDPR point of view, but what you have and what you are doing with this data. In normal conditions you are not looking at such an aspect of your everyday work.
And actually, the certifications come in very handy here, because they bring transparency to the processes that are implemented in the customer’s company and in your data center provider’s company, or the other operators or services you are using. And if these processes are controlled and monitored on a regular basis, you can be sure that your data is maintained and worked with exactly as agreed, and that it complies with the local laws or even European laws and so on. And taking these certifications even further, it’s not only about the GDPR side, it’s about how the company works.
Because we are the data centers, but we also provide customers points of presence across Europe, and we work with other companies and actually with other data center operators. And quite often we contact some company which maybe is not so certified but is still on the market. And we see this struggle, how chaotic the approach to problem solving sometimes is, or if you have to work with these services every day, it’s really difficult sometimes to get some feedback or an SLA from the company. Therefore, I would say that this compliance with ISO certifications, security certifications, is really important if you want to be sure that your data is safe and maintained correctly.
Beau Hamilton (17:21.398)
Right. So it sort of standardizes the order of operations. It makes everyone play by the same general set of rules. And I think the certifications on top of that just help continue to refine that process. And like you were saying, every company has different data needs. They work with data in different ways. So just being able to be certified in a way that allows you to work with their needs, I think, just improves the overall relationship. So that’s an important part of the puzzle.
Rihards Kaletovs (Delska) (17:54.374)
Yeah, I would actually even add that it brings some freedom to the customers in some aspects. Just as an example, whatever country you are in, if you go to McDonald’s, you know you will get exactly the same food and the same taste as in any other place. And here it is the same. You are working with a company which is certified according to your requirements. You can freely move your services if you, I don’t know, for example, don’t like the pricing or for any other reason. And if you move to a company which works on the same principles with the same certifications, you will get the same level of service. There will be no surprises or some degradation in SLAs or something like that.
Beau Hamilton (18:39.82)
Now, what are you hearing from businesses in regard to this topic of data sovereignty? Like, have you seen an uptick in the concern level about where their data lives? Just given this, you know, this big kind of AI gold rush of data centers and infrastructure, and then also with this sort of maybe political uncertainty and these macro conditions. Have you seen, I imagine you’ve seen, a lot more businesses just raise their concern level around where their data is being stored?
Rihards Kaletovs (Delska) (19:21.672)
Yes, I would say that AI brought this uncertainty to the next level, because previously the data was quite common and we used to know what type of data we were working with, as I mentioned: the emails, files, whatever, pictures. But with AI, there are many security risks, because if you expose your company’s data to some AI platforms or something, there is this unusual way in which this information slips out of your control. That’s why I would say recently the AI market, or this industry, is going a little bit local.
We already see that not everyone is using only public services. The enterprises specifically are bringing up local AI clusters so the company can work with their data in a safe manner, because they are sure where it is and who is working with this data. I would say, in Europe specifically, this is also a big point for GDPR systems, because let’s say we use ChatGPT, it’s very common these days, and you are sending your PDFs, your emails to it and then trying to get some summarization or financial data.
But during this process, the data is flowing to different countries. It’s processed there and then brought back. Maybe there is some footprint left in that location, or if there is some downtime at a particular location, you are not controlling how the data flows. That’s why this data locality, and also the AI locality, let’s call it that way, is quite significant. And that’s why you sometimes have to look in the local region for infrastructure which is able to provide you the premises or hardware or services to bring your own AI up and running.
Beau Hamilton (21:45.26)
Right, yeah, so it really just gives you, the business owner, more peace of mind, I would say, of just being able to control the data. So another big issue we’re seeing is the growing power and capacity shortages, both here in the United States and in Western Europe and elsewhere in the world. These data centers, I mean, they just require vast amounts of power and resources.
Is there a correlation between energy shortages and an increased interest in these regional providers like Delska? In other words, are these energy shortages effectively causing businesses to really think about diversifying into secondary markets, maybe as a risk management or mitigation strategy?
Rihards Kaletovs (Delska) (22:35.388)
As we just talked about, bringing the data closer to the customer, this locality, is one of the aspects of how we can split up these big energy requirements for AI loads. And the other thing is, it is really difficult to bring such huge loads and power demands to a single place, even for hyperscalers. They are still the guys who have to bring up the infrastructure at a single point, and of course bringing in so much power, the cooling, and also the noise for the neighbors, it’s a big issue. Usually in the history of IT, I would say, after such a big concentration of information loads or some other technical things comes the period when the industry understands we have to scale out. So we have to bring some loads a bit further out. Especially in the AI case, it’s quite easy, because the latency specifically to the customer is not so important. Those few milliseconds, you don’t feel them when you work with the chatbots or some AI systems. The latency, of course, is important inside these data centers, where everything works inside the AI platform. However, for the customers, we can bring this infrastructure closer, split it, because not all the data centers or all these platforms should be very, very huge. We can scale out, bring it to the edge and bring it closer to the customer.
So, yes, we feel the need for this. We work with the customers which are smaller and don’t need such big scale. Of course, the big customers should work with the hyperscalers or bring up their own infrastructure. But other customers have to understand their needs and find the appropriate size of infrastructure and the location where to put it.
Beau Hamilton (24:53.218)
Now, I think this would be a good time to mention that you, at Delska, have just opened, or are about to open, a brand new data center, a 10-megawatt data center in Latvia. Is that correct? I think that’s a pretty significant milestone. Maybe you could talk about what makes this facility really stand out in comparison to some of the other data centers in the area, or in Western or Northern Europe, and how this helps customers in the area.
Rihards Kaletovs (Delska) (25:24.102)
This data center is one of the biggest in our region. On a global scale, it is, I would say, a normally sized data center, because it differs from country to country. What’s interesting is that this data center was designed and actually built based on very green technologies. We tried to bring in the most innovative systems, like magnetic levitating chiller systems, so that there is very low resistance. They are very smooth in operation and scaling.
Beau Hamilton (26:31.448)
I will just say that, you know, I think it’s very honorable to see the focus on these green initiatives and trying to make the data centers as efficient and environmentally friendly as possible, right? Because you have this kind of supply and demand issue where you have all these companies racing to incorporate AI and then having to build the back-end infrastructure to support it. And a lot of the data centers, to keep up with the demand, have to use these gas turbines, which environmentally leave a pretty big footprint. So to see that you’re using, I imagine you’re also using solar. What are some of the other kinds of renewable energy sources you’re using?
Rihards Kaletovs (Delska) (27:41.544)
Yeah, if we look at the sustainability topic, as I mentioned, we are trying to bring up the most efficient technology from the cooling perspective, all the low-power and very flexibly adjusting chiller systems. And from the sustainability part, what is quite interesting is the gas which we are using for cooling.
Quite often, especially in older data centers, the gases or refrigerants which are used are quite old, and they have very big CO2 footprints if we compare them to modern gases. So we are using the gas which is called 1234ze, I believe, which is equivalent to a CO2 ratio of about 1 to 6. The older gases are at a scale of 1 to 2,000, 1 to 10,000. So the difference is dramatic, in the thousands of kilotons, if we compare it. Also, bringing up such data centers, we are not looking only at the IT side or the cooling side. We are also looking at how the building is brought up. We are building and getting certified to the LEED standard, which ensures that during the building process we are not harming the environment and we are using materials that are as green as possible. And inside the building, the office part as well is built according to the most efficient rules, so as not to waste water or create any waste problems and so on.
One of the aspects in our region, I believe maybe even in a bit broader scope, is that we are using green fuel. Because what is the main aspect of the data centers, why do we need them? Because there is redundant electricity. If there are some grid outages, which can happen, it’s not normal, but it happens, during these moments the generators usually start to work and provide the electricity to the data center. And the fuel which is used in the generators produces quite big emissions from a CO2 perspective. We have worked with the company Neste, and they developed it, I believe, in laboratories in Finland.
The fuel, they call it Neste MY, is a 98% renewable fuel. So even during the outages, we are going the green way. And in normal operation, of course, we use fully green electricity. That’s how we manage to stay on this zero-emission path.
Beau Hamilton (30:50.062)
That’s great. I didn’t realize that you also used renewable fuel with some of the gas generators. But I think having that big picture, right? Everything from constructing the actual building from a green standpoint, making sure you’re considering the locals and the local environment from the moment you break ground, I think it’s good to consider the impact there, but also the renewable energy, the renewable fuel.
Is this something that you’re doing from a belief standpoint, an ethics standpoint, from your own company’s philosophy? Is it something that’s being regulated from a government standpoint, or are you seeing specific demands from businesses saying, we want to work with you, but we really value sustainability, so we really prefer that you adopt some of these standards? What’s the reason behind moving towards the sustainability aspect?
Rihards Kaletovs (Delska) (31:56.188)
Yeah, there are, I would say, three or four aspects to it. For sure, we are looking after our environment, the world, and we really want to save the trees and not produce CO2 emissions. Second, we see and know that the European Union regulations are coming. They have already been in quite a long development cycle, I believe, like 10 years.
But it’s very close to when the regulations will hit the market and there will be some restrictions. Maybe not too harsh, but it’s better to prepare now. And third, and I believe it’s maybe even the most important point, is the financial part of this.
It’s not only about saving everything; if we figure out how to efficiently reuse this heat, it will lower the cost of the services. It was quite interesting: recently I was participating in a conference in Poland, and one of the main topics we were talking about in one panel discussion was heat reuse for local house and hot water heating. Because what is a data center in general? We are using, for example, one megawatt of electricity, and we are fighting to reduce how much we need to consume additionally to cool everything. But this one megawatt is lost, in general. If we work with the local municipalities, and in our region it has already started, we are working with our local government.
We are building the connection to the local utilities and we will bring this heat to the local citizens. We will heat their houses, we will heat the water, and it will actually dramatically reduce our costs. If you look from the efficiency point of view, there is this parameter called PUE, which is the efficiency of the data center, and until now, a PUE of one was the most efficient data center. There is no such thing at this moment. Usually it’s 1.2, 1.4; the bigger the number, the worse. But by bringing this heat to the citizens, we will go below one. At some moment it looked crazy, but yes, it actually goes below one from some perspective of the calculations.
Beau Hamilton (34:48.994)
Huh, yeah, so you’re capturing that energy that would otherwise just be lost and you’re making good use of it. That’s really interesting. It makes me think of, I remember at the Consumer Electronics Show a number of years ago, I was talking with a company, I believe it was a French company, whose whole business concept was placing little data centers, computers, into people’s homes. And then the excess heat that was generated would heat their home, or maybe would flow into the water heater. But essentially it was kind of a cloud-based computing that was spread across people’s homes, which I thought was a really neat idea. It’d be interesting to check in and see how they’re doing. But yeah, really fascinating.
Rihards Kaletovs (Delska) (35:34.92)
Yeah, by the way, on the same topic: just recently, we remember the big boom of Bitcoin, when everyone was bringing up these very hot computers. And there were actually companies who were building teapots which had the Bitcoin mining part inside. And you could heat your coffee or water at home, and at the same time you were getting some money back by producing Bitcoin. Actually, we can see that there are some similarities from that perspective.
Beau Hamilton (36:08.075)
Yeah, it’s a win-win. You get your coffee paid for and you get some heat, you get some Bitcoin, and you get some coffee. That’s living in the future right there. That’s peak humanity. Okay, the last little topic and part of the puzzle with sustainability is the water usage, which I don’t believe we touched on just yet. Maybe you can just share some of the strategies you’re using to reduce water consumption and improve overall resource efficiency in regards to how much water is being used. Because a lot of people don’t realize water is just as crucial as electricity in a lot of ways with some of these data centers.
Rihards Kaletovs (Delska) (36:47.164)
This is also a very hot topic. Actually, we talked about it at the same conference: especially in Western Europe, or the more southern part of Europe where the climate is a bit hotter, from a simple physics perspective it is actually harder to cool down the data centers, and the data centers have to use water.
It’s called adiabatic cooling, to make this process more efficient. So you are balancing between wasting water and efficiency, or electricity spending. And you have to balance; still, quite a big amount, tons of water, is evaporated. The problem is not so much that the water is poisoned or something, it just evaporates, and all the utility systems have to produce additional water, and in some regions the water problem is quite significant.
For us it’s much, much easier. We are much further to the north and we don’t need water for cooling. In our region, for, I would say, 75% of the year we are using free cooling. So we are not even using the compressors or AC systems. We are just running the fans, and the data center is happy. And during the summertime, our region doesn’t need these adiabatics either, because the humidity in the air is at a level where normal cooling is efficient. So in general, bringing the workloads to the cold regions will make the environment more green.
And if you look at this topic a little bit into the future, or actually already these days, with the upcoming AI workloads the infrastructure is changing. I would say in five or ten years, when the new data centers are built by hyperscalers, by us, or by any other players, they will be built with this technology in mind, and none of the regions will need water, because AI workloads produce such high temperatures that you can cool them easily. I like to say: just open the window and everything will be cool. You don’t need to throw water on that system.
Beau Hamilton (39:39.726)
Interesting. Okay. So you’re saying that in the future, the infrastructure and hyperscalers will be so efficient that they won’t give off as much heat, which wouldn’t require as much water to cool. Is that what you’re saying?
Rihards Kaletovs (Delska) (39:55.912)
They will still produce the same amount of heat, because we use the electricity. But they will not struggle with how to cool it, with big cooling farms or redundant systems. Okay, redundancy is still needed there. But you can cool it with free cooling, or there are evaporative towers, although evaporative towers depend on the region and still use water. If we go the free cooling way, it is very, very efficient; we can skip the water part and still be efficient. So I believe this is quite a big jump, and let’s see how the industry works with this option.
Beau Hamilton (40:39.192)
Well, it’s definitely given me a lot to think about. I mean, all the factors that go into a data center, whether it’s the compliance regulations, the energy requirements, or the water supply. I’m glad that, at least from a proximity standpoint, being located in the Baltic region, you have the water needs handled, I would say. So that’s a good plus just from the location.
OK, one thing I don’t believe we’ve touched on just yet, and it’s a big component of your business, is what I believe you call the Baltic Highway, which is Delska’s high-capacity fiber network. I would say it’s one thing to have the compute power to process some of these AI tasks, but it’s another to be able to deliver a high capacity of requests and actually send and receive the data required to make all these advancements possible. And I think that’s where having something like the Baltic Highway comes in. So maybe you can talk about this. How does the Baltic Highway enhance connectivity, improve things like latency, and improve overall resilience for enterprises operating in the region?
Rihards Kaletovs (Delska) (41:54.844)
As it has been for many years now, the workloads involve a big amount of data. There was the big data period; now it’s AI. AI uses even more data, and you have to transfer it between the customer and the data center, or between countries. It depends on the business case.
That was also the reason why we formed the group of companies. One of the companies had deep experience in data centers. Another company had very extensive knowledge of networking, and they had the DWDM network across our countries and across Europe. That’s why we joined. And now we can provide combined services for the customers. We can connect their offices, we can connect locations between the European countries, we can place the customer’s data in different locations and provide disaster recovery for these customers. And specifically, we control the whole platform and the whole environment, including the data center and the communication channels over this DWDM network. So we can give the customer the vision of how all of his infrastructure is under the control of the same company, brought together in the most efficient way. That was our vision, and it has actually brought all these services together now.
Beau Hamilton (43:35.416)
So you have the data centers, you have this fiber network. I think the next piece of the puzzle would be how you manage the network, right? Maybe you could share what cloud-based platform you use, and how it bridges the gap and helps businesses scale across the region.
Rihards Kaletovs (Delska) (43:56.648)
The communications is one thing, but then you have to attach it to something on both ends. Of course, one point is the customer, but we still need some workloads in the data centers. Sometimes customers need bare metal, or they bring their own equipment, but quite often it’s much easier for customers to use cloud services, easier from a scaling perspective and also from the speed of deployment. We have been working in the cloud market, I would say, almost as long as in data centers, maybe five years less, around 20 years of experience. We have worked with, and are still working with, all the platforms: VMware, Hyper-V, and many other, smaller ones.
And we collected all this experience over these years. And we saw that there is a need from the customers which we can fulfill. So we have built our own platform, not based on these bigger vendors. We brought our own platform, and we provide the features that customers are actually asking us for.
And we also saw what is lacking on other platforms on the market. Customers get a very intuitive, we like to use the phrase “iPhone-style,” interface with this platform. You naturally understand how it works and how you can do things inside it. And the platform really scales very well.
We have been on this platform for three years already, so we have passed the initial period. Of course, there were some challenges; we are not hiding that, that’s normal. Yes. It was very interesting for us, actually, to bring a completely new product to the market. And we are very happy with the results today.
Beau Hamilton (46:09.464)
Well, that’s super exciting. And I will say you’re definitely speaking my language with the iPhone analogy and the importance of the UI, right? I mean, we all use apps on a daily basis, and having an intuitive layout is really important for getting the most use out of a platform. And it’s one of those things that changes over time, is constantly refined, and gets better and better, but I’m glad it sounds like you’re aware of the importance of user design and interface. But also, more broadly, thanks for describing all the different parts of your business, whether it’s the data center, the fiber network, or this cloud platform. I think it’s a powerful mix, you know? And it all leads to better results.
I wanted to ask your opinion on where things are going. I know you did touch on some of these efficiency gains in the infrastructure in your data centers with regard to water and energy usage. But looking into the future, whether it’s five to ten years ahead, and I know it’s hard to even look that far in advance, what are some of the breakthrough technologies or design philosophies that you think will redefine how data centers handle power, cooling, and all the things that high-performance computing needs?
Rihards Kaletovs (Delska) (47:48.168)
I’m a bit cautious with forecasts like that, at least on a five-year scale. Ten years is a bit further than I’m willing to go. I don’t think there will be some big changes; I would say the big changes are already happening, or have already happened. Actually, the arrival of AI was quite a significant change in all aspects, not only IT; it’s coming into our lives in different ways. And we are experiencing the spike of this technology now. I would say this technology is starting to mature. We saw that the first AI systems were using, for example, 1,000 kilowatts. Now we can do the same with 100 kilowatts. That’s already 10 times more efficient. And I believe this efficiency will continue to improve, and we will get more and more efficient systems and products out of this big boom coming from the AI systems. I guess this is the most important thing, and the same systems will actually improve themselves, and the data centers too.
Just a small example: we as a data center operator are also using AI to improve efficiency in our data centers. We brought this technology to our older systems, the older data centers, first, and I would say on the first pass we got an 8% efficiency gain. To some people that may look small, but when you talk about megawatts or gigawatts, these numbers are quite significant.
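To put that 8% in perspective, a quick hypothetical calculation helps. The 8% gain is from the conversation; the 10 MW facility load and the electricity price are illustrative assumptions, not figures from Delska:

```python
# Hypothetical annual impact of an 8% efficiency gain at data-center scale.
FACILITY_MW = 10           # assumed facility load, for illustration
HOURS_PER_YEAR = 8_760
EFFICIENCY_GAIN = 0.08     # 8% gain cited in the conversation
PRICE_EUR_PER_MWH = 100    # assumed electricity price, for illustration

annual_mwh = FACILITY_MW * HOURS_PER_YEAR          # total yearly consumption
saved_mwh = annual_mwh * EFFICIENCY_GAIN           # energy avoided per year
saved_eur = saved_mwh * PRICE_EUR_PER_MWH          # rough cost avoided

print(f"annual consumption: {annual_mwh:,} MWh")
print(f"energy saved:       {saved_mwh:,.0f} MWh/year")
print(f"cost saved:         ~{saved_eur:,.0f} EUR/year")
```

Under these assumptions, a "small" 8% gain on a 10 MW site avoids roughly 7,000 MWh and on the order of 700,000 EUR per year, which is why single-digit percentage gains are significant at this scale.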
Beau Hamilton (49:58.134)
Wow. Yeah, really interesting. Would you say the most important, or the biggest, efficiency gains will probably come from the AI models themselves, as opposed to some of the infrastructure and hardware efficiency gains that you’re seeing in the data centers? Is that right? Is that safe to say?
Rihards Kaletovs (Delska) (50:24.826)
Unless there is a new Einstein, or a discovery on that level, I would say yes: the optimization of these AI systems, and also what they are doing with all the other platforms, the applications, the software, and the features, will be the biggest gain. Because from the hardware perspective, the physics is still there; we cannot break the laws of physics. We can adjust around them, find some small tweaks, until something crazy happens. But at this moment, we can only make the AI more efficient and more smart, let’s use this word.
Beau Hamilton (51:15.222)
Yeah, totally. It’s interesting, and it’s something I’m going to keep an eye on, right? Especially with some of these Chinese models in particular, DeepSeek, Baidu; the open-source models over there are proving to be pretty efficient. And I think you have this cat-and-mouse dynamic between different regions and their AI models, each trying to become more efficient, more intelligent.
And so it’s helping everyone up their game. But hearing your input from a hardware standpoint and a data center standpoint is really interesting. It’s given me, again, a lot to think about. So I appreciate everything you shared with us. Now, for those interested in learning more about Delska or getting in contact with your team, where should they go? Where would you send them?
Rihards Kaletovs (Delska) (52:12.648)
The best way to contact us is usually person to person. It’s my personal opinion, from our many years of experience, that talking to a human is really the most efficient way to work on any problem or challenge. That’s why our main idea is this personal touch to all your businesses, platforms, and so on. The website matches our name: it’s www.delska.com. It covers all the regions and all the services; you will get a first look at everything. And of course you will see possibilities to chat with us, call us, or write emails, whatever it is. Let’s meet and work on your projects.
Beau Hamilton (53:21.08)
Perfect. All right, listeners, you heard him. Delska.com is where to learn more and get in contact with Rihards and his team. Rihards, such a great conversation. I really appreciate everything you shared with us, and I’m excited to see where all this goes. It’s such a fast-growing space, and hearing about the data center innovations there is really fascinating. And congrats on the recent 10-megawatt center you guys are opening. I think that’s really exciting.
Rihards Kaletovs (Delska) (53:48.636)
Thanks. Thanks. So, nice to meet you. Yeah.
Beau Hamilton (53:50.976)
Absolutely. All right. Well, thanks. That’s Rihards Kaletovs, Chief Technology Officer at Delska. Thanks again. And thank you all for listening to the SourceForge Podcast. I am your host, Beau Hamilton. Make sure to subscribe to stay up to date with all of our upcoming B2B software-related podcasts. I will talk to you in the next one.