I can't speak for the parent, but I was using RAII and smart pointers in the 00s, and they provided a lot of the benefits that got standardised with C++11.
RAII and smart pointers definitely were a thing in the 90s. I wrote lots of COM code using these techniques. According to Wikipedia, RAII was invented between 1984 and 1989.
The average C++ codebase isn't from the 90s. In all the recent C++ polls, the average language revision used is between C++14 and C++17.
Besides, I'm pretty confident that more new C++ projects are created daily in 2021 than were created monthly at the peak of the 90s C++ craze - on GitHub alone, C++ accounts for 6-7% of repositories, which amounts to a few million recent C++ repos.
I've worked on several code bases that nominally are C++11 or 14. However they still contain a lot of code written by people still coding like it's the 90s.
We are discussing the sentence "The average C++ code base has as many segfaults than the average C code base."
__s and you said "You were using C++ >= 11 in 90s/00s?" to which I answered that this was not the point, because the average C++ code base isn't from the 90s/00s.
> On all the recent c++ polls the average language revision used is between c++14 and 17.
Polls of hobbyist coders, or software houses? I would be surprised if most software houses had migrated to C++17 yet. TensorFlow is stuck on C++03, I think.
It does not mean that you're not using C++11. That macro is just a compatibility flag that lets your code work on old Linux distros that ship a C++11 compiler but did not want to rebuild their whole archive. It mainly means that std::string is implemented with copy-on-write instead of the small buffer optimization.
In 1992, I was working on the Taligent project, probably the first major C++ operating system. (It failed.) I remember when the ARM (the Annotated C++ Reference Manual) came out - none of the compilers we had available could really do templates. Or namespaces.
Oh no... ROOT. It's a testament to the pure grit and gumption of thousands of poor undergraduates that particle physics can advance with this. Eons ago, I tried several times to help my then-girlfriend (you know how it goes: 'Hey, you have some kind of eng diploma?' 'Yes, SW engineering...' 'Huh, so you know C++?' 'Nobody "knows" C++, but I can manage.' 'So here's what I'm trying to do, here are 3 other examples, please for the love of Wotan help.') and I was baffled at how to do anything with it. I mean, the core thing seems powerful enough, but trying to go off the beaten path (research, right?) was hugely frustrating... And I'd worked on 2 physics codebases of variable quality before. I didn't appear as competent as I'd hoped and spent so much time helping, reading docs and code without understanding much of the design. This is the codebase that started my deep distrust of OOP, and especially of OOP-as-a-mirror-of-the-real-world and inheritance-for-code-economy...
I worked with C++ & MFC in the mid-to-late 1990s, and smart pointers weren't an option. Maybe if you were doing something against MS's oddball APIs, but not in general programming, not even for mainstream MFC uses. And what was there was entirely non-idiomatic; it's like claiming C++ had garbage collection in the 1990s because you could bolt on Boehm's.
There's genrule(), which basically runs a shell script. With that, you can build anything for which you can write a command line. And you can go as far as invoking a binary that is built by another Bazel rule.
You can also define your own rules in a subset of Python (by wrapping other rules, including genrule()), so adding support for new languages should be simple enough.
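Here's a sketch of what that looks like in a BUILD file - a genrule whose command invokes a tool that Bazel itself builds from another rule (all target and file names are made up for illustration):

```python
# BUILD file sketch; target names are illustrative.
genrule(
    name = "generate_header",
    srcs = ["schema.txt"],
    outs = ["schema.h"],
    # $(location :generator) expands to the path of the compiled tool,
    # $(SRCS) to the input files, and $@ to the single declared output.
    cmd = "$(location :generator) $(SRCS) > $@",
    tools = [":generator"],
)

# The tool used above, built by Bazel from source like any other target.
cc_binary(
    name = "generator",
    srcs = ["generator.cc"],
)
```

Because the generator is listed under tools, Bazel builds it first and rebuilds schema.h whenever either the generator or schema.txt changes.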
Online content is not exactly the same as physical goods, but it's not that far either:
- it costs very real money to produce, and
- it costs very real money to be served and to remain online.
These costs may be covered by ads, subscriptions, sponsors, taxes, voluntary contributions, or anything else, but they still need to be paid.
First, be very careful about burning out. It can happen faster than you think if you have to force yourself to do the work every day.
Second, maybe you should find a different job - or at least change something about your current job. Remote work is not the best option for everyone - and it can be even worse if you work from home. Or maybe there's something wrong with your project, the technologies you use, or the people you work with. Changing any of these might improve your situation. Start experimenting - you don't necessarily have to quit your job right now. Try working from different places (a coworking space, a university library, a coffee shop, your friend's couch, ...). Speak with your manager about other projects, learning and growth opportunities, and other things you might do that you would enjoy. And if this is something you can't bring up with your current manager, start looking for a new manager.
I was in a similar situation several years ago. I worked remotely (from home) for a small startup-like* company, and in the end I had to use the Pomodoro technique and similar tricks every day just to make myself actually spend time working. Even though I liked my teammates as people and we had a great CTO, it wasn't that great on the engineering side. Using ASP.NET WebForms and Visual Studio 2008 (no management support for an upgrade) didn't help either. Neither did the fact that there was no product or customers to care about and relate to.
In the end, I felt really burned out and kept trying things just to keep working - working from the company office helped a bit, but it was really noisy there. The same went for my university lab (I was a student at the time), but that wasn't a long-term solution. Eventually, I got an internship on a research team at Google and started enjoying work again almost immediately. I even stayed on as a full-timer, and I've never looked back. Even after four years, I still look forward to going to work every single day because of the awesome people I work with and all the exciting projects we work on. There are annoying things like open-space offices, but the people and the projects more than compensate for them.
I changed a lot of things at once, so I can't be 100% sure what made the biggest difference for me, but my bet is still on the people on the team and the projects (no more ASP.NET and no more business plans changing every week). But I believe that even the change of work environment played its role.
* When I joined, the company had existed for at least five years. But otherwise, it was very much like a startup: we had a product in development, but there were almost no customers, and even the vision changed a lot - during one fruitful week, we went from an app generator to a social network for some demographic group, then a government information system, and back to apps. I guess it changed with whatever article the CEO read during lunch on a particular day. Interestingly, the management always convinced our investor to give us more money - maybe they held a child of the investor hostage or something like that.
The price for Hong Kong looks weird: the NomadCost per month in HK is roughly 350 EUR lower than the monthly rental costs alone. This is probably due to the very low listed prices of the hostel and budget hotel rooms.
This is definitely not a short-term solution, and it might not be practical even in the mid term, but wireless charging sounds like a perfect solution for this kind of environment.
The hospital rooms could have special "charging" spots on the floor or built into the furniture, or just a hole in the wall. The patients would charge their devices simply by placing them on the charging spot/shelf. With that, there are no potentially dangerous cables or plugs, and all charging can be done without the devices leaving the patients' reach, reducing the risk of theft at the same time.
There's already an international standard for wireless charging (Qi). The only problem is that it works with only a couple of current phones, and it's not very likely to appear in the low-cost, low-end devices that patients of mental hospitals are most likely to use. But if it ever gets wider adoption, this would solve a lot of problems...
I don't know what they do if you ask for a battery replacement or when something's wrong with the SSD, but changing the display on my 2011 MacBook Air was a matter of 20 minutes. I'd be very surprised if you couldn't at least replace the motherboard.
Which, by the way, is the only type of repair the other laptop manufacturers ever offered to me.
They probably replaced the whole lid assembly, whereas with most vendors you can replace just the panel with little effort (I did it on my ThinkPad in 30 minutes with the wrong tools for the job :)
Considering that the SSD and the battery have the shortest lifespans of all components, I can't fathom why they decided to make them non-replaceable. Even in worst-case scenarios, it's a recycler's nightmare.
You really don't have a clue what the rest of the OEMs are doing. Anything at the ultrabook level is essentially two pieces. Your huge mega-laptops are still user-serviceable (like a bricky T class), but not, say, a Carbon.
These two facts are not necessarily in contradiction. At this moment, in the current Humble Bundle, the average price paid by Linux and Mac users is significantly higher than the average price paid by Windows users, but the total income from Windows is still way higher than the income from Linux and Mac combined, simply because of the raw number of users.
It looks to me like there are some Linux users who don't mind buying games (and don't mind paying a premium for them), but the majority isn't interested in buying games at all, regardless of the price.
Moreover, my own experience with buying games is that (digital distribution aside) it was practically impossible to get Linux versions of games (well, at least in the Czech Republic, where I used to live). And if there was a Linux version at all, using it meant buying the box with the Windows version and then patching it. So, even though I use Linux for work, I used to play games almost exclusively on Windows (and later on the Xbox, which made things even easier).