git commit --amend --no-edit
This helped me countless times…
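For anyone who hasn't used it: a minimal sketch in a throwaway repo (hypothetical file name), showing how `--amend --no-edit` folds a staged fix into the previous commit without retyping the message.

```shell
# Throwaway repo so nothing real is touched.
dir=$(mktemp -d)
cd "$dir"
git init -q
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "initial"

# Forgot to include this file? Stage it, then amend.
echo fix > file.txt
git add file.txt

# --amend folds the staged change into the previous commit;
# --no-edit keeps the old commit message untouched.
git -c user.email=a@b -c user.name=a commit -q --amend --no-edit

git log --oneline | wc -l   # history still shows a single commit
```

The history stays clean: one commit, same message, now with the fix included.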
Actually you can… I do that with my setup. Just point your domain to the new IP assigned by Tailscale to your server. That's all. Recently they started supporting HTTPS certificates too… even though it's not needed for internal-only communication.
Replacing a human with some form of tech has been a long-standing practice. In this scenario, profitability and efficiency usually follow a known pattern. Unfortunately, what you described is exactly how the market has always operated in the past, and how it will keep operating in the future.
The general pattern: a new technology is invented, or a new opportunity is identified, and a bunch of companies enter the market as competitors, offering competing prices to customers in an attempt to gain market dominance.
But the problem starts when low profits push some companies to the point where they either go bust, dissolve the division, or sell to a competitor. Usually after this point a dominant company emerges in the market segment, and monopolies are created. From then on, companies either raise prices or exploit customers to extract more money, and thereby start making profits. This has been the exact pattern in the tech industry for decades.
This is also why, in the case of AI, companies are racing to capture market dominance. Early adopters always get a small advantage, which helps them gain prominence in the segment.
This is something people always miss in these discussions. A graphic designer working for a medium-sized marketing company is replaceable with Stable Diffusion or Midjourney, because there quality is not really that important. They work on quantity, and "AI" is much more "efficient" at producing quantity, without even paying for stock photos.
High-end jobs will always exist in every profession. But the vast majority of jobs in a sector do not belong to the "high end" category. That is where the job loss is going to happen, not for Beeple Crap-level artists.
I completely agree with this. I work as a User Experience researcher and I have been noticing this for some time. I'm not a traditional UX person; I work more at the intersection of UX and programming. I think the core problem in any discussion about a software product is that the people talking about it kind of assume everyone else functions the same way they do.
What you described here as a techie is, in simple terms, a person who uses (or has to use) the computer and file system every day. They spend a huge amount of time with a computer and slowly organise their stuff. Most of the time they want more control over it; some of them end up on Linux-based systems, and some find alternative ways.
There are two other kinds of people. One is a person who uses the computer every day but is completely limited to their enterprise software. Even though they spend countless hours on the computer, they rarely end up using the OS itself. A huge part of the service industry belongs to this group. Most of the time they have a dedicated IT department that will take care of any issue.
The third category is people who rarely use computers. That is, they use one once or twice every few days. Almost everyone with a non-white-collar job belongs to this category. These people mainly use phones to get daily stuff done.
If you look at the customer base of Microsoft, it has never been the first category. Microsoft tried really hard with .NET in the Ballmer era, and even built a strong base at that time, but I am of the opinion that a huge shift happened with the wide adoption of the Internet. On some forum I recently saw someone say that TypeScript gave Microsoft some recognition and kept them relevant. They have made some good contributions too.
So, as I mentioned, the customer base was always the second and third categories. People in these categories focus only on getting stuff done: bare-minimum maintenance, and results with as little effort as possible. Most of them don't really care about organising their files or even finding them. Many people just redownload stuff from email, messaging apps, or drives whenever they need a file. Microsoft tried to address this with indexed search inside the OS, but it didn't work out well because of the resource requirements and many bugs. For these users, a feature like Recall, or Apple's Spotlight, is really useful.
The way Apple and even Android are going forward is in this direction: restricting the user to the surface of the product and making things easy to find and use through aggregating applications. The Gallery app is a good example. Microsoft knew this long ago. 'Pictures', 'Documents', and the other special folders were just an example; they never 'enforced' them. In earlier days people used to keep a separate drive for their documents, because Windows got corrupted easily and, when reinstalling, only the 'C:' drive needed to be formatted. Only after Microsoft started selling pre-installed Windows through OEMs were they able to change this trend.
Windows is also pushing in this same direction, limiting users to the surface, because the two categories I mentioned don't really 'maintain' their system. Just as with cars, some people like to maintain their own, and many others let paid services take care of it. But when it comes to 'personal' computers with 'personal' files, a 'paid' service is not an option. So this lands on the shoulders of the OS companies as an opportunity: whoever offers the better solution will see more adoption.
Microsoft is going to land in many contradictions soon, because of their early widespread adoption of AI. Their net zero global emission target is a straightforward example of this.
Well… I think you are expecting too much of the average person. I'm pretty sure a lot of people are going to be 'mind blown' by the new Recall feature and hail it as a technological marvel. Very few people care about privacy, and of those, very few really understand how to get some. Complete privacy is near impossible.
Come on… don’t be so pessimistic!!
The last part, about the Stack Overflow account and forum account, is about the infamous Silk Road admin, Ross Ulbricht.
I actually use Nginx. The major advantage shows when you have to access something directly, for example when a client app on your device wants to reach a service you host. In that case Heimdall won't be enough. You can still use IP with port, but I prefer subdomains. I use Nginx Proxy Manager to manage everything.
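As a rough sketch of what such a subdomain mapping boils down to (hypothetical names: `jellyfin.example.com`, a backend on `100.64.0.10:8096`; Nginx Proxy Manager generates something equivalent for you):

```nginx
# One subdomain per service, all proxied by a single Nginx instance.
server {
    listen 80;
    server_name jellyfin.example.com;        # subdomain you control

    location / {
        # Forward to the internal service (Tailscale IP + service port).
        proxy_pass http://100.64.0.10:8096;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Client apps can then be pointed at `http://jellyfin.example.com` instead of a bare IP:port pair.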
Regarding the network going down: the proprietary part of Tailscale is the coordination server. There is an open-source implementation of it called Headscale. If you are okay with managing your own thing, this is an alternative; obviously the convenience will be affected.
Apart from that, if you haven't already, I highly recommend reading the blog post "How Tailscale works". It gives a really good introduction to the infrastructure. The summary is that your connections are P2P, using WireGuard. I don't think Tailscale will hit a failure scenario that easily.
I hope this helps.
The exact setup can be achieved with Tailscale. A not-so-well-known feature: you can point your domain to the Tailscale IP (the new IP assigned to your server by Tailscale), and it will act just like a normal hosting setup.
The advantage: any device or person you have not pre-approved can't see anything if they go to the domain or subdomains. Those only resolve to something reachable if you are connected and authenticated to the Tailscale network. I have a similar setup; if you need more pointers, please ping me.
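Concretely, the DNS side is just ordinary records pointing at the Tailscale address (hypothetical zone-file sketch; `home.example.com` and the `100.x` address are placeholders for your own domain and the IP shown by Tailscale for your server):

```
; Public DNS, but the target is a Tailscale (CGNAT-range) address:
; only devices authenticated to your tailnet can actually reach it.
home.example.com.     300  IN  A  100.101.102.103
*.home.example.com.   300  IN  A  100.101.102.103
```

To anyone outside the tailnet, the names resolve but the address is unroutable, so nothing is exposed.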
If you have used AnyDesk in the past, this gives the same experience. I used it recently, and it has a lot of features, including unattended access.
They recommend self hosting an instance for better performance.
Yes, it is not in alignment with the spirit of open source. In the "industrial districts" copyrights have no validity. That means if one company develops something, any other can adapt it without any restriction, even without a license. This is very counter-intuitive to our capitalistic rules, but the policy essentially forces you to make progress as quickly as possible; otherwise someone else will adapt it, make a product out of it, and you lose the whole market.
China forces companies to make money from capitalist economies, but restricts the accumulation of "knowledge" or "technology" into a few mega-corporations.
At least, that is the theory. But, as everywhere else, corruption and hunger for power screw things up in China too.
Syncthing is for when it is not a one-time transfer; LocalSend is mainly for one-time transfers. LocalSend needs the devices to be on the same network (the same Wi-Fi router is enough), while Syncthing can also send files over the internet.
There are browser-based alternatives like ShareDrop. These tools are not as reliable as Syncthing and LocalSend, especially for single large files (more than a few GBs), like ISOs.
For one-time transfers over the internet, another handy tool is Croc. It also suffers from the same large-file issues.
Hmm… Machine learning on a dataset with images?
Recently they officially added a module to censor stuff on an individual instance basis…
The what?? Is it a condition?
The song has been sung…
One thing you can try, if you haven't already, is configuring two different ports for the two users here. The GUI has an option to adjust the ports, and you can also configure two different services to start depending on the logged-in user. I haven't done it myself on Linux, but it looks like people have had success. One R*ddit thread for example,
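A sketch of the two-services route (everything here is hypothetical: the user name `alice`, the paths, and the port choices; the flags are standard Syncthing CLI options, but double-check against your installed version):

```ini
# /etc/systemd/system/syncthing-alice.service
# Second Syncthing instance with its own home directory, so its
# config (and therefore its ports) stays separate from the first user's.
[Unit]
Description=Syncthing for alice
After=network.target

[Service]
User=alice
# A different --gui-address avoids clashing with the default 8384.
# The sync port (default 22000) must also differ; change it per instance
# in Settings -> Connections, or via listenAddress in that home's config.xml.
ExecStart=/usr/bin/syncthing serve \
    --home=/home/alice/.config/syncthing \
    --gui-address=127.0.0.1:8385 \
    --no-browser
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

With one such unit per user (each with its own `--home` and ports), both instances can run side by side on the same machine.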
IP should not cause any issues; IDs are just a hash of the certificate used by Syncthing. Can you elaborate a little on the current setup? Device, OS, user, etc. Also, if possible, can you explain your use case? As I mentioned, Syncthing is very specific about what it can do, so it may not be the best solution for your case.
The site is Sansec; they uncovered it. They also describe how the malware redirects users to sports-betting sites.