I've worked for the European Space Agency and EUMETSAT and it's crazy how much effort is wasted because everybody is writing separate mission control, monitoring and data analysis applications that do 95% of the same things, each with its own horrendous UX/UI, idiosyncrasies and bugs.
I'm not in that industry anymore, but I just wish everybody would grow up, use this (and related) software, and contribute. There's no reason not to (except pride, of course).
This is actually a generic problem in public acquisition - projects are tightly funded to meet specific customer requirements and cannot themselves resolve enterprise-level problems. Building for re-use tends to add complexity / cost, and a hard-nosed PM will not easily be persuaded to solve someone else's problem. It seems to require top-down commitment of intent and resources - and a big stick - to make individual projects do the right thing.
Obviously there can be situations where common approaches are developed and used but this seems to be the exception rather than the norm.
(source - I've spent 10 years trying to work this issue in a UK public acquisition context)
This is an issue in enterprise situations as well. I have hundreds of servers which are supposed to be "managed", but each project / product manager just adds their own company card to Amazon and builds their own platform. The only way this gets solved is if it's escalated to VP / CTO level and they force everyone to follow the "standards".
Oddly enough, most enterprises I know have server "standards" for their own data centres and would dearly love to just fork over cash to Amazon to get them out of that hell.
Yup. The heads of IT at enterprises are begging to just switch everything to AWS/GCP/etc., while all the mid level IT guys are starting to see the writing on the wall. Much of what they do will not exist anymore and they will need to retool themselves and/or find a new job.
This is quite true. At the end of the day, you've got to meet the requirements, and there are design choices you can make so that the solution for one customer is clean and elegant and yet completely unusable for another customer.
In any build vs. buy situation, "build" is driven by the desire to enlarge someone's empire, while "buy" is driven by personal relationships and other sales techniques.
The first maximizes "unique requirements" while the second minimizes them.
Between the two, I generally prefer "build", bad as it is, because it leads to fewer horror shows later.
That was naturally the reason why they were created in the first place, but not why they couldn't standardize on a single one across the institute's research groups.
It's the same in the automotive industry (development and end-of-line testing; XCP and MDF helped, but ultimately devolved into a meta-standard for vendor-specific functions), and in the rest of the industry I've had contact with (cooling, heating) fragmentation is as bad as ever.
Standardizing on web standards will probably follow the usual pattern: adding another layer that is worse than anything in the previous 30 years.
> ...it's crazy how much effort is wasted because everybody is writing separate mission control, monitoring and data analysis applications that do 95% of the same things, each with its own horrendous UX/UI, idiosyncrasies and bugs.
Heh, sounds like they have a lot in common with JavaScript frameworks and web development in general. With this observation in mind I doubt making mission control "web based" will rectify the situation...
What do you mean? I save a lot of time in web development using solutions created by others. I don't think this problem is unique to JS/web development; it applies to software development as a whole. "Not invented here" syndrome is quite prevalent in a lot of areas, and a lack of options doesn't mean the product will be any better. Quite the contrary: it can mean that everyone is using the same buggy solution because there are no alternatives.
I do not know the details of mission control technology, but my uneducated guess would be that this is a cultural/management problem in these organizations. A top-down hierarchy with very large resistance to change would make introducing new ideas (an open source solution in this case) a tough job for any single actor. Also, the IT contractors and companies benefiting from selling these solutions are probably doing everything they can to keep the status quo as it is.
> What do you mean? I save a lot of time in web development using solutions created by others.
How much time do you lose keeping track of those solutions and their dependencies?
The JS world probably has the worst fragmentation problem in software history. Every level of the stack is rapidly evolving, constantly spawning multiple half-baked alternative solutions to the same problems. I think GP's point is simply that if similar things are happening at NASA on a small scale and are considered a problem, then switching to a technology ecosystem that is a living embodiment of those issues isn't going to help.
Lead Dev for Open MCT here, excited to see this show up while I was reading the news last night! Happy to answer questions if I can, and please don't hesitate to contact us using the email address on the website as well.
We use them more and more in the public sector, and they can be perfectly fine.
It’s a little silly, but it’s where most of the UI innovation and technical skills lie at the moment, and we’re frankly getting to the point where there isn’t that much of an alternative.
This is great. What did NASA use before? I always got the impression they were still using a Unix-type OS and CDE or some other Motif-based WM far into the 2000s.
This would really modernize things. And I love how the whole project is so clearly made to enable contribution from the open source community. Really aspiring to leverage open source in the best possible way.
I guess they'd have to make it as easy as possible because the project might not have many uses outside of NASA.
I spent the bulk of the first 8.5 years of my career in and around JPL’s Spacecraft Assembly Facility (SAF), Test Engineering Lab (TEL), Assembly, Test, and Launch Operations (ATLO), aka, Building 179.
A lot of software systems in Operations a little more than 10 years ago were (Open?)Motif on Solaris. I remember helping a colleague transition to Qt when Motif’s event system just couldn’t handle an extra 30 data streams we were projecting for use by Mars Science Laboratory. It was something like a 30-minute demo of Qt Designer (with Hal, the software engineer actually tasked) and him giving up a weekend to fully replace all the Motif code. The codebase had shrunk by more than half, all 30 additional data streams had been added, and the Ops Engineers reported that the app was noticeably snappier even with the new streams.
The truth is, there are still 5+ competing options within NASA, with each center having staff familiar with different tools, and with a generally large learning curve for new tools it is a hard sell to adopt something new. Most of the time, the tools you used on the last mission are the same tools you will use on the next mission.
We’ve seen a remarkable uptick in Open MCT usage at different centers purely because it is open source and folks can use it for free. Other similar products require you to contact the supporting org, request a delivery of the software, and spend days/weeks deploying and configuring it for your mission before you can even begin to evaluate the software. We’re finding that by lowering barriers to access and by allowing others to modify it to suit their needs, we are getting much better buy-in from operators and missions.
I'm at NASA JSC. We're using Open MCT in a few research projects to monitor simulated spacewalks. We've got dashboards where we're tracking a mix of human physiology data (heart rate, CO2 flow, etc) and spacesuit telemetry (that's mostly simulated at this point).
Great tool! We're looking at relying on it more heavily. I'm actually planning on developing a new widget for Open MCT in the near future :)
This is very cool! I got into high-powered model rocketry as a hobby and made my own very basic, command-line-driven mission control software. My rocket has a little Raspberry Pi Zero wired up with an IMU and some other sensors. On the pad it connects to my mission control system over wifi running on my laptop. I get to do the whole launch sequence and "the launch computer has taken over the countdown" thing (technically a no-no but it's just for fun on my own). The onboard flight computer detects apogee using a barometer, accelerometer, and a Kalman filter, which then deploys the chute with a little black powder charge. It's a very fun and educational hobby but can get $$
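For anyone curious what barometric apogee detection looks like, here's a minimal sketch (not the poster's actual code, and the tuning constants are assumptions): a 1-D constant-velocity Kalman filter over noisy barometric altitude, treating acceleration as process noise (accelerometer fusion is omitted), which declares apogee once the estimated vertical velocity stays negative for several consecutive samples.

```javascript
// State: [altitude (m), vertical velocity (m/s)]; measurement: barometric altitude.
function makeAltitudeFilter(dt) {
  let x = [0, 0];
  let P = [[100, 0], [0, 100]];   // large initial uncertainty
  const q = 100;                  // accel variance (process noise) -- an assumption
  const r = 1.0;                  // barometer measurement variance -- an assumption
  return function update(zAltitude) {
    // Predict: constant-velocity model, x = F x with F = [[1, dt], [0, 1]]
    x = [x[0] + dt * x[1], x[1]];
    const P00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q * dt ** 4 / 4;
    const P01 = P[0][1] + dt * P[1][1] + q * dt ** 3 / 2;
    const P10 = P[1][0] + dt * P[1][1] + q * dt ** 3 / 2;
    const P11 = P[1][1] + q * dt * dt;
    P = [[P00, P01], [P10, P11]];
    // Update with the altitude measurement (H = [1, 0])
    const y = zAltitude - x[0];             // innovation
    const S = P[0][0] + r;                  // innovation variance
    const K = [P[0][0] / S, P[1][0] / S];   // Kalman gain
    x = [x[0] + K[0] * y, x[1] + K[1] * y];
    P = [
      [(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
      [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]],
    ];
    return { altitude: x[0], velocity: x[1] };
  };
}

// Require several consecutive negative-velocity estimates before firing the
// charge, so one noisy sample near apogee can't trigger an early deploy.
function makeApogeeDetector(filter, samplesRequired = 5) {
  let negCount = 0;
  return function (rawAltitude) {
    const { velocity } = filter(rawAltitude);
    negCount = velocity < 0 ? negCount + 1 : 0;
    return negCount >= samplesRequired;     // true => deploy chute
  };
}
```

A real flight computer would also add arming logic (e.g. only enable detection above a minimum altitude) so ground-level noise can't fire the charge on the pad.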
I don't know how I feel or what I think about running mission control from a browser. Seems like it'd probably lead to lower defects to build something very simple very close to the metal — and probably not at a prohibitive level of cost, either. The advantage would be cutting out the OS, the desktop environment, the fancy GPU software, the browser, the JavaScript interpreter, the JavaScript dependencies &c. from the critical path. The disadvantage of course would be running homegrown software which performs some (but not all!) of those functions.
Maybe I'm wrong, though. I'm certainly open to being persuaded.
I made the transition from writing native Qt applications in C++ to JavaScript front ends (generally React) a few years ago. In my experience, writing robust front-end applications was much harder in C++ than it is in JavaScript. The memory management in Qt can get pretty complicated for large applications, and a single mistake can get you a segfault crash. Not to mention that it's much easier to hot-patch a JavaScript file on the server and then ask everyone to please hit CTRL-R in their browsers than it is to recompile and redistribute a binary to all my clients. At this point, all of the UI components at my company are JS based and I couldn't be happier.
For those interested, Ball Aerospace has a similar open source tool called COSMOS written in Ruby. It has been used for a number of missions there as well as some CubeSats:
Great to see this getting open-sourced but let's be honest: without support for PUS, CCSDS and other ECSS standards (or any alternative) this only covers a single-digit percent of the effort needed to develop a mission control system.
Correct-- there are other components of a ground data system which handle the space link and other services, but that is outside the scope of Open MCT. We integrate with a large number of those systems via plugins, although it is difficult (for non-technical reasons) to open source all of that work.
"mission control" is a relatively opaque concept and I hope that by open sourcing components we can shed more light on the overall architecture and complexity of mission systems and perhaps even begin to simplify them.
I recently uploaded a video to YouTube of Jay Trimble's talk about Open MCT from the Open Source CubeSat Workshop 2018 that took place in Madrid. Feel free to check it out: https://www.youtube.com/watch?v=y6Dh74INR_I
On the one hand, modernizing such things is a really, really good initiative. On the other hand, JavaScript/CSS frameworks tend to be very actively developed, changing their APIs or even deprecating the whole framework. This could be a problem for mission-critical software.
Why would it be a problem? You freeze the dependencies and then occasionally remind yourself that despite your use of web technology you absolutely do not want to control your mission from devices exposed to the regular web, where Updates Matter.
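Concretely, "freezing" in a web stack can be as simple as pinning exact versions (no `^` or `~` ranges) in `package.json` and committing the lockfile; the package names and version numbers here are placeholders, not Open MCT's actual requirements:

```json
{
  "name": "mission-console",
  "private": true,
  "dependencies": {
    "openmct": "1.3.0",
    "vue": "2.5.16"
  }
}
```

Committing `package-lock.json` (or using `npm shrinkwrap`) pins the transitive dependencies as well, so every deployment installs byte-identical versions regardless of what the ecosystem does afterwards.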
VueJS will replace a subset of our Angular 1.x usage, and we will remove Angular 1.x, but it's not fair to describe it as a "swap": they do different things.
Very cool. I visited Johnson Space Center mission control in Houston last week and was reminded how really smart people are monitoring and flying the ISS 24/7.
ISS has planned communication outages resulting from satellite signal loss. Everyone in mission control knows when these disconnects will occur and how long they’ll last, and plans their breaks around them.
I hope you will see this comment. I wanted to let you know that your account has apparently been "shadowbanned", which means that you can see your own comments but no one else sees them unless they have "showdead" turned on in their account settings.
If you view an HN page that you commented on in an incognito window, you will see what I'm talking about.
I can only guess that the moderators took this action because of the large number of Amazon affiliate links you have been posting.
It's fine to post an occasional product link when it relates to the topic, but affiliate links are not so welcome - especially when they are disguised behind an amzn.to shortened URL. Just post an original Amazon link, with everything removed from the URL except the minimum required to go to the correct page. The URL should look like https://www.amazon.com/dp/NNNNNNNNNN/ where the N's are the ASIN.
Some of your other comments, like this one, are good quality and people have "vouched" them which makes them visible to all.
I suggest you email the moderators (address is somewhere in the links at the bottom of every page) with an apology and a promise to not post any more affiliate links. Maybe they will reinstate your account.
Yes, as I said, this is because someone vouched it. Try turning on "showdead" in your account settings and view their comments in a logged in window, and you will see all the dead comments. Then view the same URL incognito or without showdead and you will see what I'm talking about.