Podcast Introduction
Robert Curran: Thanks for joining us for another episode of the Appledore Research Podcast. My name is Robert Curran, and I am a consulting analyst with Appledore. As ever, we're here to share insights on the transformation of telecom in the cloud, network automation, and AI era. If you enjoy today's podcast, make sure to follow us on LinkedIn and Twitter. Now enjoy the show. Welcome back. In this week's episode of the Appledore Podcast, Patrick Kelly is joined by IBM for a discussion on the progress and obstacles in quantum computing. In particular, what will the first practical applications of this amazing technology be, and when will we see them? The discussion is a fascinating insight into the path from pure research to business value and IBM's approach to finding it. Enjoy the next 20 minutes.
Quantum Computing: Exploring Market Use Cases and Practical Applications
Patrick Kelly: Hello, everyone. In today's podcast, we'll focus on quantum utility and some of the more prominent market use cases for this technology. It's my pleasure today to welcome Ahmed Akhmani, an IBM Quantum industry partner. Welcome, Ahmed.
Ahmed Akhmani: Hi, Patrick. Nice to meet you.
Patrick Kelly: Today, we're going to focus on four key topic areas. First, recent advancements in IBM quantum computing. Second, some of the approaches and methods for identifying high-value use cases. Then we'll dive deeper into the quantum computing solutions that are likely to become commercialized, and finally we'll wrap up with potential applications in the telecommunications market.
So I'll start out, Ahmed. You gave the industry a briefing in late March on what IBM is doing in the quantum computing space. One interesting thing I found is that you've had two breakthroughs that promise to speed up market adoption of quantum computing. The first was the introduction of a novel error mitigation technique to tackle noise interference. The second milestone was the development of a new error-correcting code, which essentially lays the groundwork for some of the advancements further out on the roadmap. Can you tell us where we are in the era of quantum utility, where quantum computing is used for more commercial applications?
Ahmed Akhmani: Yeah, sure. IBM published our roadmap in 2020, and we have been updating it since then, most recently in 2023, to show our progress. From 2016 to 2023, we spent a lot of time increasing the size of our quantum computers, adding more qubits, and we reached a point where the machines were big enough to do some meaningful experimentation. So last year, in 2023, we ran the first quantum utility experiment, where we showed, using this error mitigation technique, that a quantum computer could perform computations faster than, and beyond the reach of, brute-force classical computation. This is a very important milestone because, at the end of the day, the example we ran in 2023 was one where no classical brute-force method could compute the exact solution.
After the paper we published, many other groups tried to approximate this result with classical methods. So I think we have reached the level where we can say that quantum computers, with this error mitigation technique and the size of machine we now have, more than 100 qubits, have the capacity for computation beyond brute force. That is where we are now, and we are seeing more and more actual experimentation in the field, more on the scientific side, I would say, but we are also starting to see more and more experimentation on the industry side. In this era of quantum utility, we believe that error mitigation, together with other techniques coming in the next year or two, such as circuit knitting and tighter integration with HPC, will help us start to see some initial industry use cases in what we call the NISQ era.
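As a concrete illustration of how that error mitigation is exposed to users today, here is a minimal sketch using the Qiskit Runtime Estimator on a small circuit. It is not the 2023 utility experiment itself, just the same mechanism at toy scale; the backend selection and option names are assumptions and differ between qiskit-ibm-runtime versions.

    # Minimal sketch: requesting built-in error mitigation when estimating an
    # observable with the Qiskit Runtime Estimator. Toy circuit only; option
    # names differ between qiskit-ibm-runtime versions.
    from qiskit import QuantumCircuit
    from qiskit.quantum_info import SparsePauliOp
    from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager
    from qiskit_ibm_runtime import QiskitRuntimeService, EstimatorV2 as Estimator

    service = QiskitRuntimeService()                      # saved IBM Quantum account
    backend = service.least_busy(operational=True, simulator=False)

    qc = QuantumCircuit(2)                                # toy stand-in for a 100+ qubit circuit
    qc.h(0)
    qc.cx(0, 1)
    observable = SparsePauliOp("ZZ")

    # Compile for the device and map the observable to the chosen layout.
    pm = generate_preset_pass_manager(backend=backend, optimization_level=1)
    isa_circuit = pm.run(qc)
    isa_observable = observable.apply_layout(isa_circuit.layout)

    estimator = Estimator(mode=backend)
    estimator.options.resilience_level = 1                # turn on built-in error mitigation
    # Higher resilience levels or explicit zero-noise extrapolation can also be
    # configured, depending on the service version.

    job = estimator.run([(isa_circuit, isa_observable)])
    print(job.result()[0].data.evs)                       # mitigated expectation value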
In parallel, as you said, we are working on error correction techniques that will be integrated into our hardware starting around 2029. These error correction techniques will help us solve other problems whose algorithms need fault-tolerant quantum computers.
Patrick Kelly: Okay, excellent. I know that IBM groups quantum computing applications into broad categories: simulating nature, advanced mathematics, and optimization. So you have these broad categories, from material discovery in simulating nature to anomaly detection in some of the advanced mathematical areas. IBM has also developed an engagement model around the IBM Quantum Accelerator. Could you tell us more about that model and how it aligns with some of the customers you're working with?
Ahmed Akhmani: Sure. Actually, this approach is called the Quantum Accelerator. I would say we started it, in a different version, five years ago. Based on the lessons learned, we changed the approach: we collaborate with industry partners to help them become quantum ready, and not just quantum ready, but really to help them start building the internal capability, from quantum itself to the integration of quantum into the classical world, so they can reach quantum business value in the future. So, in summary, what makes this approach unique is that it is not just a technical prototype or POC that we try for a few months, but a roadmap.
It's a holistic approach built on quantum hardware, with three strands: business, education, and technical experimentation. The first, on the business side, is helping our partner, the enterprise, build a use case roadmap for quantum: defining and identifying the key applications for them. This is very important for understanding what is feasible and what is not, because quantum really doesn't apply to all use cases; you need to understand where it applies in your context and what value is feasible. So we answer that question with them. We help them get educated and understand what quantum is good for on the business side, which also helps us find the use cases. On the technical side, experimentation is important because it's the only way to build understanding and build applications. So we do iterative prototyping with the technical team and find where quantum applies. And in the last year, because we are entering this era of quantum utility, we have started to scale this type of prototype to 100-plus-qubit experimentation and to understand how quantum is integrated into the classical workflow.
Patrick Kelly: Interesting. Over the past five years, IBM has identified 100 or so potential quantum applications across many industries. One of the things you called out in the March briefing was three specific use cases in different sectors. The first addressed last-mile delivery routing, the second focused on cutting emissions, and the third was around reducing financial fraud. Can you give us a deeper dive into some of the quantum use cases that will effectively become commercialized in the near future?
Ahmed Akhmani: Before that, I think it's important to understand which types of use cases will be disruptive for an industry. We believe the disruptive use cases are the ones that are complex on the computation side and carry very high value. We can think about industries with a research and development arm where, as you said, quantum chemistry would be very important for finding new materials or drugs.
But there are also other complex challenges in optimization, for example in businesses with a lot of complex data that is difficult for them to make sense of. These industries include finance, automotive, life sciences, logistics, and telco. So these are the types of industries we believe will be disrupted. That said, really, most industries will be disrupted eventually; the question is about timing, cost, and benefit, because you really want to use quantum where it brings a lot of value.
So, to answer your question about the types of application we see in the near term, and near term here is not months for us, it's still years: you gave the example of fraud. We work a lot on fraud with different players in the banking industry. Fraud is an important use case because it's a complex problem; finding complex patterns in the data is very difficult because the number of confirmed fraud cases is limited, so we don't have a lot of data. The false positive rate in the industry is very high, about 90 percent. And if you think about the value and the opportunity, it's in the trillions, around two trillion dollars, so it's huge. Any improvement in fraud detection is important. We also see fraud becoming more complex in the future, with generative AI and other types of patterns that constantly change. We did some experimentation with our own division, which has the Safer Payments solution, and we showed that there is value in integrating classical and quantum. And by the way, most of the algorithms in the near future will be hybrid, classical and quantum.
We did a second experiment with HSBC, where we tried novel algorithms to improve the quantum classification algorithm we were using. So we believe that quantum machine learning will be one of the first applications in the near future. That said, it may not be the most disruptive one initially; it will be more incremental, because we'll use it alongside classical methods. This is another reason we think fraud is a good use case. Now, after identifying the right approach and algorithm to integrate classical and quantum, the real question, and we are working on it with some banks, is how to make this real: how to run it every month, or every time we need to run it. There are still some limitations we are working through.
How do we integrate that? First, we cannot apply this to all types of fraud. For example, real-time fraud detection is impossible right now; let's be very clear, quantum doesn't allow real-time fraud detection. But it could be used as a secondary tool, for example to reduce false positives, and it can be used in batch mode. That is possible, and that is important to understand. So it can be used in payments, but it's very difficult to use right now on credit cards, for example. This is the type of insight we find when we work with companies and understand the full workflow: in experimentation, quantum could apply, could have value, and could increase accuracy, but when you put it into a workflow and consider all the other constraints, those constraints can make it not applicable for now. If we think smartly, though, we can find ways, and the types of fraud, where we can apply quantum. As I said, we believe this will be one of the first areas where we see real applications in banking. And if we can think about fraud, we can also think about anomaly detection in any other field; it's really a classification problem.
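To make the classification idea concrete, here is a minimal sketch, not IBM's or any bank's production model, of how a quantum-kernel classifier could act as that secondary check on transactions a classical engine has already flagged. It uses the open-source qiskit-machine-learning package on synthetic data; the feature map, kernel choice, and dataset are illustrative assumptions, and by default the kernel is evaluated on a simulator rather than real hardware.

    # Minimal sketch (not a production model): a quantum-kernel classifier used
    # as a second-stage check on transactions already flagged by a classical
    # system. Data is synthetic; feature map and kernel choices are assumptions.
    import numpy as np
    from qiskit.circuit.library import ZZFeatureMap
    from qiskit_machine_learning.kernels import FidelityQuantumKernel
    from qiskit_machine_learning.algorithms import QSVC

    rng = np.random.default_rng(7)

    # Two engineered features per flagged transaction;
    # label 1 = confirmed fraud, 0 = false positive.
    X_train = rng.normal(size=(40, 2))
    y_train = (X_train[:, 0] * X_train[:, 1] > 0).astype(int)
    X_flagged = rng.normal(size=(10, 2))   # today's batch of classically flagged cases

    feature_map = ZZFeatureMap(feature_dimension=2, reps=2)
    kernel = FidelityQuantumKernel(feature_map=feature_map)  # simulated by default

    qsvc = QSVC(quantum_kernel=kernel)
    qsvc.fit(X_train, y_train)

    second_opinion = qsvc.predict(X_flagged)
    print("kept for investigation:", int(second_opinion.sum()), "of", len(X_flagged))

The design point mirrors the discussion above: the quantum model never sits in the real-time path; it only re-scores a small, already-flagged batch.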
Patrick Kelly: You made an interesting point there, Ahmed. You said it's not necessarily real time, so you wouldn't use quantum technology to detect fraud as it happens. Can you give us a sense of how it would be used alongside conventional computing? And I think you noted two trillion dollars annually among the numbers you mentioned, and, in the UK, $2 billion in fraudulent transactions. What reductions could we expect to see in fraudulent transactions using quantum computing?
Ahmed Akhmani: It's a difficult question. As I said, there are many sub-use cases when we talk about fraud. One that is interesting in the near term is reducing false positives. If you look at a fraud workflow, there is a lot of manual work on investigations and on disputes, when someone doesn't agree that there was fraud, and so on. All of that investigation is done manually, and this part of the workflow can be reduced. So it's not just about reducing the impact of fraud but also about reducing the internal cost of investigating it. This could be the first application because it doesn't need real time: we can run a program, let's say every day, over the cases the classical system has flagged and find the subset that are false positives. We think of it as a second check. That can dramatically reduce the numbers and increase accuracy. So this is one example that we think can work soon within the technical constraints we have on quantum.
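As a purely illustrative sketch of that batch arrangement, the outline below shows where such a second check could sit in the workflow. Everything here is a placeholder: classical_flag() stands in for the real-time fraud engine and quantum_second_check() for an offline, quantum-backed classifier like the one sketched earlier; neither reflects an actual IBM or bank implementation.

    # Placeholder workflow: a real-time classical pass, then a nightly batch
    # "second check" over only the flagged cases to weed out false positives.
    from dataclasses import dataclass

    @dataclass
    class Transaction:
        tx_id: str
        features: tuple

    def classical_flag(tx: Transaction) -> bool:
        """Real-time classical engine: cheap, but with a high false-positive rate."""
        return sum(tx.features) > 1.0                    # placeholder rule

    def quantum_second_check(batch: list) -> list:
        """Offline batch re-scoring (e.g. the kernel classifier sketched above)."""
        return [sum(t.features) > 1.5 for t in batch]    # placeholder decision

    def daily_review(transactions: list) -> list:
        flagged = [t for t in transactions if classical_flag(t)]  # real-time pass
        keep = quantum_second_check(flagged)                      # nightly batch pass
        return [t for t, k in zip(flagged, keep) if k]            # fewer manual investigations

    txs = [Transaction(f"tx{i}", (0.4 * i, 0.3)) for i in range(6)]
    print([t.tx_id for t in daily_review(txs)])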
Quantum Computing in Telecom: Network Optimization and Anomaly Detection
Patrick Kelly: Let me shift gears to the telecommunications industry. I noted that the IBM quantum team is working on specific use cases related to things like network traffic routing, anomaly detection, and contextual customer segmentation. Can you, without naming names, give us any indication of the work that IBM is doing in the telco segment with some of the customers and what the more prominent use cases are that are gaining some traction through the use of this technology?
Ahmed Akhmani: Yeah, so for telco there are, as you said, three key use cases. One of them, which we believe is still long term but where there is a lot of value, is network optimization, including antenna placement and self-optimizing networks. The real benefit there is what quantum can bring to optimization. Any telco network is really complex, and if you think about local versus global optimization, it's very difficult to do a global optimization of the network, and to do it regularly. So today we decompose the problem and use local optimization, sub-networks, and so on.
Once quantum allows us to do this global optimization, the opportunity is huge, because in some optimization problems we know we are 15 to 50 percent away from the optimum. We cannot prove it, and we cannot find the best solution, but we know from experimentation that we are really far off. So the opportunity is huge, and many telco companies are interested in that, even if it's long term. In the shorter term, think about improving network security by better predicting anomalies, threats, and cyber-attacks: this is exactly the same as fraud.
For sure, the industry is totally different, the data is different, and the patterns are different. However, the algorithmic side is not far from the classification problem we discussed on the banking side, and we believe that reducing false positives and improving the accuracy of such models will help greatly. So network optimization and anomaly detection are the two we are discussing most with telco companies.
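For a sense of how such an optimization problem is handed to a quantum computer today, here is a toy sketch, not an IBM engagement or a carrier-scale model, that poses a tiny weighted max-cut (a common stand-in for partitioning and placement problems) as a QUBO and solves it with QAOA from the open-source Qiskit optimization stack on a local simulator. The graph, the weights, and the choice of modules are illustrative assumptions, and exact import paths vary across Qiskit versions.

    # Toy example only: a 4-node weighted max-cut posed as a QUBO and solved
    # with QAOA on a local simulator. Real network optimization at carrier
    # scale is far beyond what today's devices can handle.
    from qiskit_algorithms import QAOA
    from qiskit_algorithms.optimizers import COBYLA
    from qiskit.primitives import Sampler
    from qiskit_optimization import QuadraticProgram
    from qiskit_optimization.algorithms import MinimumEigenOptimizer

    # Weighted links between 4 sites; split the sites into two groups so that
    # as much link weight as possible crosses the cut.
    edges = [(0, 1, 3.0), (0, 2, 1.0), (1, 2, 2.0), (1, 3, 4.0), (2, 3, 1.0)]

    qp = QuadraticProgram("toy_cut")
    for i in range(4):
        qp.binary_var(name=f"x{i}")

    # Max-cut objective: sum over edges of w * (x_i + x_j - 2 * x_i * x_j).
    linear = {f"x{i}": 0.0 for i in range(4)}
    quadratic = {}
    for i, j, w in edges:
        linear[f"x{i}"] += w
        linear[f"x{j}"] += w
        quadratic[(f"x{i}", f"x{j}")] = -2.0 * w
    qp.maximize(linear=linear, quadratic=quadratic)

    qaoa = QAOA(sampler=Sampler(), optimizer=COBYLA(maxiter=100), reps=2)
    result = MinimumEigenOptimizer(qaoa).solve(qp)
    print("partition:", result.x, "cut weight:", result.fval)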
Customer analytics comes up less often, but we believe it is a good use case. Think about the segment of one: making a contextual classification of your customer, using different data and finding the right correlations to determine the next best action for that client. For sure, there are classical solutions, and they do a great job. But if you want to push it a bit further, make it contextual, and find other correlations, then quantum can bring value there too; it can apply to telco, and it can apply to airlines. As I said, though, this third one gets less focus, maybe because the classical solutions are doing a good job right now.
But beyond these three use cases, there is something that is not use-case oriented but is really where the huge opportunity for telco resides, and we are talking with companies about it almost every week or two: telcos becoming providers of quantum cloud computing services. They are cloud providers already, and this is the real opportunity that nobody is talking about: providing quantum technology to others as a cloud service and connecting it with all their other solutions. We are starting to see it at a small scale; I would say nobody is a quantum provider at scale yet, because nothing is in production. However, in Germany, T-Systems is already a quantum cloud provider and is building an ecosystem around that. So we are having a lot of discussions with other telco companies about this, and I think there is a very important opportunity there that telcos are interested in.
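To illustrate what "quantum as part of a cloud portfolio" looks like from a developer's seat today, here is a brief sketch of submitting a circuit to a QPU through a cloud service, using IBM's Qiskit Runtime as the example; a telco-hosted offering would presumably wrap a similar workflow behind its own catalogue. The account setup is assumed to exist already, and parameter names vary between Qiskit Runtime versions.

    # Illustrative only: reaching a QPU through a cloud service (IBM Quantum via
    # Qiskit Runtime). Assumes credentials were saved beforehand; API details
    # differ between qiskit-ibm-runtime versions.
    from qiskit import QuantumCircuit
    from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager
    from qiskit_ibm_runtime import QiskitRuntimeService, SamplerV2 as Sampler

    service = QiskitRuntimeService()
    backend = service.least_busy(operational=True, simulator=False)

    bell = QuantumCircuit(2)
    bell.h(0)
    bell.cx(0, 1)
    bell.measure_all()

    # Compile to the device's native gate set and qubit layout before submitting.
    isa_bell = generate_preset_pass_manager(backend=backend, optimization_level=1).run(bell)

    sampler = Sampler(mode=backend)
    job = sampler.run([isa_bell], shots=1024)
    print(job.result()[0].data.meas.get_counts())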
Patrick Kelly: Good point. So, it’s an extension of some telco companies’ managed services.
Ahmed Akhmani: Exactly. At the end of the day, the reality is that the computer of the future will be hybrid by definition, with CPU, GPU, and QPU. So, if you want to be a player in future computing, you need to have QPUs; it's by definition. And I think telco companies understand that and are starting to prepare themselves for it.
Patrick Kelly: OK, Ahmed, thank you for your time. We look forward to inviting you back as the industry evolves to update us on where things stand, not only in the telecommunications industry but also in some of the other industries we discussed.
Ahmed Akhmani: Thank you very much. Thank you, Patrick, for today.
Patrick Kelly: Join us next time for more insights and conversation on telco transformation.