Transcript
Hi, Mike Matchett with Small World Big Data. I'm here today talking about AI, of course. How do we use AI in IT? How do we make it productive? How do we actually get things done with AI? Lots of people have chatbot access now, and lots of people are doing vibe coding and that sort of stuff. But there's still a lot of friction there, a lot of cut and paste, a lot of errors and retries and so on. But I have Devolutions here today, who's got a really interesting application and implementation of AI within what they do with RDP and remote desktop management. So stay tuned, this is going to be interesting. Hey, Marc, welcome to our show.

Yeah, glad to be here.

All right, so let's just start with Devolutions. You guys have been around doing, I don't want to say small-scale, but kind of in-your-backyard development of remote desktop management tools that has grown pretty large over ten years. Tell us a little about Devolutions and what you guys do.

Yes. Well, most people know Devolutions for Remote Desktop Manager. For those who are not familiar with Remote Desktop Manager, it does remote desktop management, as you can see: RDP, SSH, VNC, PowerShell, anything that takes a credential and needs to connect to something. Think of it as your IT professional tool in which you have all of your connection entries to connect to just about everything, with your credentials, in one place, and it makes it as simple as a double click, and it connects to whatever you need to connect to as fast as possible.

So kind of like a combination of a password vault, where I might have to store all my passwords and connections, but then wrapping it up in some tool so that, once I've configured it to connect to somewhere, I can just click and connect, point to somewhere. And IT people who are managing lots of different remote machines all day long would really, really appreciate this.

Yes, yes, you got it right. There's a big difference between a password manager and a connection manager. With a connection manager, we take care of grabbing the credentials and injecting them into the connection that you need to make, so you don't need to copy and paste your credentials everywhere. You just click, we take care of everything, and it connects for you.

Since you brought that up and we're on the topic of security, this is a more secure way of doing things than what you just said of copying and pasting credentials everywhere. I've been in places where, you know, you share the credentials on Slack, and you have them in a file, and in email, and it's just totally not secure at the end of the day, because there are so many points of leakage.

Yeah. Well, you need to centralize, you need a centralized vault for your credentials. This is step number one. But the part where you take the credentials out of the vault to actually inject them somewhere is the part which is often forgotten, and it is still very important. How do you share your credentials securely when you need to use them, after all? Right.

All right. So we're really here to talk about AI, right? We've got this clever and fast way to connect with remote desktops.
And you do that as an IT person to do maintenance on the machines, to do upgrades, patches, fixes, configuration changes, all sorts of reasons to do that. But people who are in that world are starting to rely more and more on chat to figure out what to do. A lot of it is like, hey, what are the PowerShell commands that do X, Y, or Z? And once they've got those commands, they'll go try them. So you looked at that and said, that's just a little awkward. What did you think?

We're streamlining the process. Basically, in RDM you can already connect to a lot of different systems. A lot of them use RDP; RDP is still very popular for system integration. So just think of it this way: you have RDM open, you're connected with RDP to a machine, and if you were using ChatGPT or Claude before to ask questions and copy-paste commands on the other side, what we did is let you actually connect your AI assistant to RDM, in which you have your RDP session open. Then you can ask, can you do operation XYZ, like find how much free disk space I have. Instead of going in manually and clicking and figuring out how much you have, it can generate a PowerShell command on the fly, which you can review before executing, of course. It will send the command, grab the output, and tell you how much free space you have. You want to diagnose problems? You can ask questions, and it will generate commands to start diagnosing problems for you. Things that would take a lot more time or manual work or manual copy-pasting, we just made possible through Remote Desktop Manager. We're talking about RDP here, but SSH is also supported, SQL Server is also supported. Lots of different protocols that we already have in RDM, we've worked to expose through an MCP server to your existing AI assistant, so you can skip all those manual steps of copy-pasting data back and forth.

So we're really just superpowering our AI agents with the ability to go out and connect with all the things we used to connect with directly. Which at one level seems like the world should be that way, but when we look at it practically, it hasn't been. And so it's just this little addition, and I shouldn't say little, but with your addition of an MCP server, which is Model Context Protocol for those people following their AI buzzwords and lingo, you've wrapped RDM with something that can now be a tool for an AI agent to use, and that powers it up, right? That's really what's going on.

Yes. And you said MCP, but we didn't even define what it is. MCP is kind of like a remote procedure call for LLMs. Your LLM needs to call a tool; how does it do it? Well, MCP is the solution we have right now in the industry. So what RDM does is expose all the functionality that was already there in RDM, but now you can connect your AI assistant to it. Still, there are a lot of concerns about the security of MCP. How is MCP secured? We took an approach to correctly adapt MCP to a desktop application like RDM. In our case, you launch RDM, then you connect to it. You don't launch a new RDM, you don't launch a background process; you connect to an existing one. So if you have MFA to connect to something, you don't have to bypass the MFA, you just do MFA the regular way you would normally do it. Then you connect your MCP client.
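(For illustration only: the interview doesn't show the exact command RDM's assistant produces, but the free-disk-space request mentioned above would typically turn into a small PowerShell snippet along these lines, proposed for review and then run in the open session, with the output fed back to the assistant.)

    # Report free and total space for each local fixed disk on the target machine.
    Get-CimInstance -ClassName Win32_LogicalDisk -Filter "DriveType = 3" |
        Select-Object DeviceID,
            @{ Name = 'FreeGB';  Expression = { [math]::Round($_.FreeSpace / 1GB, 1) } },
            @{ Name = 'TotalGB'; Expression = { [math]::Round($_.Size / 1GB, 1) } }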
Now, if you're familiar with MCP, one problem you may have is that if you want to expose an MCP server to other processes on the local system, you could use a localhost HTTP server. The problem is this breaks user boundaries. So we made a named pipe server for MCP in RDM, and we made a bridge executable that gets launched as a subprocess, which is one of the standard MCP transports. All it does is connect to the pipe server of RDM. So at a bare minimum, only AI assistants running in the same user context as RDM can connect to it. That is not enough in itself, so we added another measure: a confirmation prompt. On the RDM side, you receive a connection request from the MCP client, do you want to accept it, and this is the program that's trying to connect. You should not get this prompt if you're not actively connecting your AI assistant. So if by mistake you run a malicious command that tries to connect to the MCP server to use RDM, you would get this prompt. You just press deny, simple as that.

All right. So some easy security things for the user to deal with, but they've actually made this much better than if someone just opened up the pipes and connected it themselves on the back end. Right? It'd be too easy to just give the agent direct access to everything. So when people get hold of this and start to use it, what are some of the things, just off the top of your head, that become easy to do? Like, I'm sitting there and I've got my chat assistant. What does it look like?

Okay, well, diagnostics is something which is frequently used. But instead of asking Claude or ChatGPT what you should look for to find why this thing isn't working properly, you can just ask, and then it can run the commands for you. And since it can capture the output, it actually knows a lot more detail and can iterate on that. You just have to ask questions, and it will try to figure out ways to answer them. One important thing: RDP does not support remote execution natively. So we built the Devolutions Agent with a virtual channel extension for remote execution inside an RDP session, and this is correctly integrated into Remote Desktop Manager, and now exposed through the MCP server. The funny thing is we built this before MCP was even a word, so you could already just go and run commands from the RDM client side that would be pushed to the server for execution. But when we were building the MCP server, we thought, well, if you have the RDP session open and this agent running, we can actually start running commands through it, and you can connect your AI assistant. The main advantage for you is that you don't need to install an AI instance inside the remote server. You can use your existing one on the client side for all the systems that you can connect to through RDM. So this is a game changer for a lot of people.

Yeah, I think subtly in there is this idea that you're changing my chat conversation with an AI from one of give me advice, or let me explore the space, to: do it. You know, tell me what's wrong on that machine, rather than tell me how to find out what's wrong on that machine. Right? So it's kind of this shift in tense and shift in verb, and that is a sea change for a lot of people.
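(For illustration only: the interview doesn't name the pipe or the bridge executable, so the pipe name below is invented. The sketch just shows the named-pipe arrangement described a moment ago, assuming the bridge and the assistant run in the same user session as RDM: a client process opens a Windows named pipe to the already-running RDM instance, which then gets to accept or deny the connection.)

    # Hypothetical pipe name; the real one isn't given in the interview.
    $pipeName = 'RDM.MCP.Example'
    $pipe = [System.IO.Pipes.NamedPipeClientStream]::new('.', $pipeName,
        [System.IO.Pipes.PipeDirection]::InOut)
    try {
        # Only succeeds if a pipe server with this name is reachable; in the design
        # described here that is the existing, already-MFA'd RDM instance, which
        # then shows its accept/deny prompt to the user before serving MCP traffic.
        $pipe.Connect(2000)
        "Connected as $([Environment]::UserName); MCP JSON-RPC would flow over this stream."
    } catch [System.TimeoutException] {
        'No MCP pipe server answered within 2 seconds.'
    } finally {
        $pipe.Dispose()
    }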
And if you want to be more careful about how you approach it, you just tell it: tell me what you are about to do before doing it. Then it will describe what it's going to do, you can review what is about to execute, and if it's fine with you, you just say go, and it does it for you. But now you don't have to go, okay, I'm fine with this, let me go and copy-paste it. The other thing is that the LLM can actually grab context automatically. If you have a chatbot which is completely separate from what you're using, it is unable to get that context automatically; you have to provide it. So not only do you need to copy-paste the command at the end, but you also have to copy-paste a lot of information to feed the LLM. In the case of the SQL Server entry that we have, when you connect to SQL Server it will extract the database schema. So when it generates SQL commands, it will actually know about the tables and columns. Of course you review the command before it is executed, but otherwise you would have to tell it: no, this is my column name, this is my table name. Now it just knows that ahead of time, without you even needing to tell it explicitly what the name of the table is.

Some very cool stuff. So can I ask you a slightly related question: what does it take to deploy Devolutions in practice, so that I get that remote execution ability in RDP and RDM, and then how much harder is it to get the MCP server set up?

Okay. If you're already an RDM user, well, you have step one. You need RDM, this is the obvious one. And you need an AI assistant. A lot of the experimentation that we do is with VS Code and GitHub Copilot; if you're a PowerShell user, this may already be what you're using for running your PowerShell scripts, so it's super easy to just connect an MCP server to it. Otherwise, Claude can also be used. Anything that is an AI assistant with the ability to configure an MCP server can be used with RDM. So you choose your own. We do not provide the AI assistant ourselves. We are working on building one inside RDM; we got early feedback from customers that they would like the option to bring their own AI assistant or have one which is built in. But at this point, what we're offering is an MCP server, so you bring your own AI assistant. This is an interesting approach, because instead of having yet another AI assistant, if you bring your own, you can start connecting your AI assistant to RDM and, on the other side, to Confluence, for instance. Confluence has an excellent MCP server. So if we talk about creating a new vault in RDM that matches your organizational structure, let's say you have a Confluence page with your organizational structure already in place. You connect your assistant to both RDM and Confluence, and you can ask: can you pull the data from Confluence about my organizational structure, and then please create all of this structure in my data source? And then all the folders will be created, matching the data that you have in the other data source. So you need to think about it like this: we used to have one AI assistant for every tool, but now we have one AI assistant that can talk to multiple tools, and RDM is now one of those, right?
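(For illustration only: going back to the SQL Server point a moment ago, the server, database, table, and column names below are made up, and Invoke-Sqlcmd comes from the SqlServer PowerShell module rather than from RDM. The idea is simply that because the schema was extracted from the SQL Server entry, the assistant can propose a query against real table and column names instead of guessing, and you still review it before it runs.)

    # Hypothetical names: an 'Inventory' database with a dbo.Assets table.
    $query = 'SELECT TOP (10) AssetId, HostName, LastCheckIn FROM dbo.Assets ORDER BY LastCheckIn DESC;'
    Invoke-Sqlcmd -ServerInstance 'sql01.example.local' -Database 'Inventory' -Query $query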
You're really doing something kind of subtle here at one level, but at another level, you're making AI actionable for people instead of just advisory, which is a very cool thing to step back at the end here of 2025 and say: okay, now I can see how AI can actually help me get my job done better and faster, and not just tell me I should be doing a better job in some way. Okay. So, Marc, if someone wants to learn a little bit more about RDM from Devolutions, and they're interested in this MCP server idea and they say, hey, I could start using this tomorrow, where would you have them look?

Well, Google for Devolutions, our main website, Remote Desktop Manager. And then we encourage you to go into our forums to report feedback, and we listen very closely to that feedback. We have developers who are very active about making changes to the MCP server that way. The way we've been developing it is, as we added more tools, we tried them out, and then we figured, hey, now that I have this tool, I'd like to go one step further, and then it's a simple thing to just expose one more thing from the MCP server. It's just that we do not know which things need to be exposed for everybody's use cases. So the more you use it, the more you can make feature requests, and we can be very proactive about fixing and improving those very quickly.

All right. So check out Devolutions, folks, for RDM. Check out the MCP server that you can launch, that accesses your remote desktops through RDM and can do remote execution, again making your IT AI chat actionable; particularly as you combine this with other MCP servers from other tools, it becomes a stronger and stronger system. Appreciate you being here today, Marc, and explaining this to us.

Well, it was a pleasure. Thank you.

Thank you. And check it out, folks. Take care.