

Update Those Tech Gifts Before Putting Them Under the Tree. Here’s How


Lengthy setup can put a real damper on the joy of a cool gift—both on the giving and receiving end. Immediately running into updates after turning on a new tech toy can be a real bummer, and downloading files instead of participating in a holiday gathering isn’t so fun, either.

There’s an easy fix, though: Handle all that work before you give a tech gadget.

Not only is this advice tried and true, but it becomes even more relevant with each passing year. Companies lean more and more heavily on immediate updates for shipped products. As a result, some of the most popular tech gifts can be real time sinks, especially new gaming PCs and consoles. Take the time now to apply patches, install necessary apps, and download huge games, and your recipient will get to dive straight in. Meanwhile, you can continue to hang out in the background and relax.

Our list below covers the most common tasks to complete to get your tech gifts into a truly ready state. It's not just PCs, Xboxes, and PlayStations; even low-key items like fitness trackers can use this kind of attention. For example, Fitbits sometimes have firmware updates that can take upward of 40 minutes to complete. (Seriously.)

Of course, some individuals love the setup process and see it as part of the ownership experience. If your giftee is that type of person, hold off on updates. The same goes if there's a chance you may have to exchange your present for something else. (Check the store's return policy on opened items first.) But most people appreciate this gift of time.

Laptops & desktop PCs


We explain how to set up a new Windows computer in more detail, but here’s a brief summary of what to make time for:

Running Windows Update
Setting up security for the PC, including installation of third-party anti-malware software (if so desired)
Removing any unnecessary bloatware
Installing necessary or useful programs (e.g., a favorite browser, password manager, document and spreadsheet editors)
Installing fun software like games or music-streaming apps

If you’re setting up a Windows 11 PC, you may want to also consider installing a program that allows the emulation of Windows 10’s look and feel—it can be helpful for loved ones easily thrown by user interface changes. The most well-known option is Start11, which costs $5. (For more details, you can read our hands-on with the program.)

Gifting a Chromebook? You can skip preemptive setup if you want. Chromebooks require so little maintenance that most people can do it themselves. They only need to review the privacy policy, connect to Wi-Fi, and check for any ChromeOS updates. But if you prefer to do this for them, it's easily completed via the guest account.

Xbox, PlayStation, and Nintendo game consoles


Consoles don’t require much maintenance, but both the system updates and the game downloads can be huge. If you don’t have a blazing fast internet connection, it can take literal hours to get everything you want onto the system. 

You’ll want to download and install:

System updates
Controller updates
Apps (Netflix, Hulu, etc.)
Games

If the console is for a young household member, you may also want to create and log into the account they’ll be using (as applicable), plus activate any online memberships like Xbox Live Gold, PlayStation Plus, or Nintendo Switch Online. 

Not sure if you need to get an online membership? It depends on the type of console you bought and the features you’re seeking. They’re generally required for cloud saves (except for Xbox), the ability to play online, and access to a selection of free games for the duration of the subscription. (Interested primarily in a subscription to lots of Xbox games? Check out how to get Xbox Game Pass Ultimate for cheap.)

Phones and tablets


Most people can handle setting up a phone or tablet on their own, and given how apps are tied to a particular account, it's somewhat necessary that they do. But for the technophobic, putting a phone or tablet in full working order can be a gift unto itself.

Presumably, you'll either be setting up their first account or you already manage their Apple or Google account. That should make it easy to:

Configure their system settings (display, ringtones, etc.)
Apply any waiting system updates and security patches
Download and log into apps
Configure apps (as applicable)

You'll still probably have to help them set up security measures like Face ID or a fingerprint reader (along with their PIN or password) in person. But doing this cuts down on a lot (a lot) of setup time later on.

Tech gadgets


This category can be a bit of a toss-up. Some devices, like wireless earbuds and Kindles, work out of the box without requiring a mandatory update. Others, like recent Fitbits, can have a firmware update waiting, and it'll get tacked on to normal setup with no easy out.

You can wait on updating these products—they usually don’t take as long as setting up a PC, game console, phone, or tablet from scratch. But if you want to do it in advance, generally you’ll download a companion app to your phone or tablet, pair it with your device, and then check for new firmware.

Streaming gear (Roku, Amazon Fire TV, Apple TV)


Most people can handle setup of these on their own, but if you manage the streaming accounts for someone not very tech-savvy, log into all the services they use before giving the device to them.

Smart home gear


Truth be told, this gear is best left to the gift recipient to set up: the best smart home products are fairly easy to install and come with good user guides.

But if you’re concerned about the difficulty level, and the device supports Wi-Fi, you can connect it to your router, install the latest firmware, and set up preferences. The gift recipient can then later change the Wi-Fi info.

If the device connects to a smart home hub, you’re best off leaving setup all to the other person. (Or setting aside a separate time to do it for them.)

Source: pcworld.com



Video Friday: An Agile Year


Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRA 2022: 23–27 May 2022, Philadelphia
ERF 2022: 28–30 June 2022, Rotterdam, Germany
CLAWAR 2022: 12–14 September 2022, Açores, Portugal

Let us know if you have suggestions for next week, and enjoy today’s videos.

Agility had a busy 2021. This is a long video, but there's new stuff in it (or new to me, anyway), including impressive manipulation skills, robust perceptive locomotion, jumping, and some fun costumes.

[ Agility Robotics ]

Houston Mechatronics is now Nauticus Robotics, and they have a fancy new video to prove it.

[ Nauticus ]

Club_KUKA is an unprecedented KUKA show cell that combines entertainment and robotics with technical precision and artistic value. The show cell is home to a cool group called the Kjays: a KR3 AGILUS on the drums loops its beats and sets the tempo, the KR CYBERTECH nano is the nimble DJ with rhythm in his blood, a KR AGILUS performs as a light artist and enchants with soft and expansive movements, and an LBR iiwa, mounted on the ceiling, keeps an eye on the unusual robot party.

And if that was too much for you to handle (?), here's "chill mode":

[ Kuka ]

The most amazing venue for the 2022 Winter Olympics is the canteen.

[ SCMP ]

A mini documentary thing on ANYbotics from Kaspersky, the highlight of which is probably a young girl meeting ANYmal on the street and asking the important questions, like whether it comes in any other colors.

[ ANYbotics ]

If you're looking for a robot that can carry out maintenance tasks, our teleoperation systems can give you just that. Think of it as remote hands, able to perform tasks without you having to be there on location. You're still in full control, as the robot hands will replicate your hand movements. You can control the robot from anywhere you like, even from home, which is a much safer and more environmentally friendly approach.

[ Shadow Robot ]

If I had fingers like this, I’d be pretty awesome at manipulating cubes too.

[ Yale ]

The open-source, artificially intelligent prosthetic leg designed by researchers at the University of Michigan will be brought to the research market by Humotech, a Pittsburgh-based assistive technology company. The goal of the collaboration is to speed the development of control software for robotic prosthetic legs, which have the potential to provide the power and natural gait of a human leg to prosthetic users.

[ Michigan Robotics ]

This video is worth watching entirely for the shoulder-dislocating high-five.

[ Paper ]

Of everything in this SoftBank Robotics 2021 rewind, my favorite highlight is the giant rubber duck avoidance.

[ SoftBank ]

On this episode of the Robot Brains Podcast, Pieter talks with David Rolnick about how machine learning can be applied to climate change.

[ Robot Brains ]

A talk from Stanford's Mark Cutkosky on "Selectively Soft Robotics: Integrating Smart Materials in Soft Robotics."

[ BDML ]

This is a very long video from Yaskawa which goes over many (if not most or all) of the ways that its 500,000 industrial arms are currently being used. It’s well labeled, so I recommend just skipping around to the interesting parts, like cow milking.

[ Yaskawa ]

Source: spectrum.ieee.org



20K WordPress Sites Exposed by Insecure Plugin REST-API


The WordPress WP HTML Mail plugin for personalized emails is vulnerable to code injection and phishing due to XSS.

Original Source: threatpost.com



Legged Robots Learn to Hike Harsh Terrain


Robots, like humans, generally use two different sensory modalities when interacting with the world. There's exteroceptive perception (or exteroception), which comes from external sensing systems like lidar, cameras, and eyeballs. And then there's proprioceptive perception (or proprioception), which is internal sensing, involving things like touch and force sensing. Generally, we humans use both of these sensing modalities at once to move around, with exteroception helping us plan ahead and proprioception kicking in when things get tricky. You use proprioception in the dark, for example, where movement is still totally possible; you just do it slowly and carefully, relying on balance and feeling your way around.

For legged robots, exteroception is what enables them to do all the cool stuff—with really good external sensing and the time (and compute) to do some awesome motion planning, robots can move dynamically and fast. Legged robots are much less comfortable in the dark, however, or really under any circumstances where the exteroception they need either doesn’t come through (because a sensor is not functional for whatever reason) or just totally sucks because of robot-unfriendly things like reflective surfaces or thick undergrowth or whatever. This is a problem because the real world is frustratingly full of robot-unfriendly things.

The research that the Robotic Systems Lab at ETH Zürich has published in Science Robotics showcases a control system that allows a legged robot to evaluate how reliable the exteroceptive information it's getting is. When the data are good, the robot plans ahead and moves quickly. But when the data seem incomplete, noisy, or misleading, the controller gracefully degrades to proprioceptive locomotion instead. This means that the robot keeps moving; maybe more slowly and carefully, but it keeps moving, and eventually it'll get to the point where it can rely on exteroceptive sensing again. It's a technique that humans and animals use, and now robots can use it too, combining speed and efficiency with safety and reliability to handle almost any kind of challenging terrain.
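To make that fallback behavior concrete, here is a minimal, hypothetical sketch of confidence-gated fusion between a map-based (exteroceptive) height estimate and a contact-based (proprioceptive) one. This is not the authors' controller, which learns the behavior end to end inside a neural network; the function names, the linear blend, and the thresholds below are illustrative assumptions only.

```python
# Toy sketch of confidence-gated sensor fusion for a single foothold.
# Illustrative assumptions only; the real controller learns this implicitly.

from dataclasses import dataclass


@dataclass
class TerrainEstimate:
    foot_height: float   # ground height under a foot from the elevation map (meters)
    confidence: float    # 0.0 = exteroception unreliable, 1.0 = fully trusted


def fuse_height(extero: TerrainEstimate, proprio_height: float) -> float:
    """Blend the map-based height with the height the foot actually felt.

    High confidence lets the robot plan ahead with the map; low confidence
    degrades the estimate toward pure proprioception.
    """
    w = min(max(extero.confidence, 0.0), 1.0)
    return w * extero.foot_height + (1.0 - w) * proprio_height


def update_confidence(conf: float, contact_error: float,
                      tol: float = 0.05, gain: float = 0.2) -> float:
    """Lower confidence when measured foot contact disagrees with the map."""
    if contact_error > tol:      # foot landed far from where the map predicted
        conf -= gain
    else:                        # map and contact agree: slowly regain trust
        conf += 0.5 * gain
    return min(max(conf, 0.0), 1.0)


# Example: a badly wrong map (0.3 m contact error) quickly drives confidence
# down, so the fused estimate falls back toward the proprioceptive height.
conf = 1.0
for _ in range(5):
    conf = update_confidence(conf, contact_error=0.3)
print(fuse_height(TerrainEstimate(foot_height=0.4, confidence=conf), proprio_height=0.1))
```

The point of the sketch is simply the gating: the same controller can range from "trust the map and plan footsteps ahead" to "ignore the map and feel the ground," depending on how well recent foot contacts have matched the map.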

We got a compelling preview of this technique during the DARPA SubT Final Event last fall, when it was being used by Team CERBERUS’ ANYmal legged robots to help them achieve victory. I’m honestly not sure whether the SubT final course was more or less challenging than some mountain climbing in Switzerland, but the performance in the video below is quite impressive, especially since ANYmal managed to complete the uphill portion of the hike four minutes faster than the suggested time for an average human.


Video: "Learning robust perceptive locomotion for quadrupedal robots in the wild" (YouTube)

Those clips of ANYmal walking through dense vegetation and deep snow do a great job of illustrating how well the system functions. While the exteroceptive data is showing obstacles all over the place and wildly inaccurate ground height, the robot knows where its feet are, and relies on that proprioceptive data to keep walking forward safely and without falling. Here are some other examples showing common problems with sensor data that ANYmal is able to power through:

Other legged robots do use proprioception for reliable locomotion, but what’s unique here is this seamless combination of speed and robustness, with the controller moving between exteroception and proprioception based on how confident it is about what it’s seeing. And ANYmal’s performance on this hike, as well as during the SubT Final, is ample evidence of how well this approach works.

For more details, we spoke with Takahiro Miki, a PhD student in the Robotic Systems Lab at ETH Zürich and first author on the paper.

IEEE Spectrum: The paper’s intro says “until now, legged robots could not match the performance of animals in traversing challenging real-world terrain.” Suggesting that legged robots can now “match the performance of animals” seems very optimistic. What makes you comfortable with that statement?

Takahiro Miki: Achieving a level of mobility similar to animals is probably the goal for many of us researchers in this area. However, robots are still far behind nature and this paper is only a tiny step in this direction.

Your controller enables robust traversal of “harsh natural terrain.” What does “harsh” mean, and can you describe the kind of terrain that would be in the next level of difficulty beyond “harsh”?

Miki: We aim to send robots to places that are too dangerous or difficult for humans to reach. In this work, by “harsh” we mean places that are hard not only for robots but also for us. For example, steep hiking trails or snow-covered trails that are tricky to traverse. With our approach, the robot traversed steep and wet rocky surfaces, dense vegetation, and rough terrain in underground tunnels and natural caves with loose gravel, all at human walking speed.

We think the next level would be terrain that requires precise motion with careful planning, such as stepping stones, or obstacles that require more dynamic motion, such as jumping over a gap.

How much do you think having a human choose the path during the hike helped the robot be successful?

Miki: The intuition of the human operator choosing a feasible path for the robot certainly helped the robot’s success. Even though the robot is robust, it cannot walk over obstacles that are physically impossible for it, e.g., obstacles bigger than the robot or cliffs. In other scenarios, however, such as during the DARPA SubT Challenge, a high-level exploration and path-planning algorithm guides the robot. This planner is aware of the capabilities of the locomotion controller and uses geometric cues to guide the robot safely. Achieving this for an autonomous hike in a mountainous environment, where a more semantic understanding of the environment is necessary, is our future work.

What impressed you the most in terms of what the robot was able to handle?

Miki: The snow stairs were the very first experiment we conducted outdoors with the current controller, and I was surprised that the robot could handle the slippery snowy stairs. Also during the hike, the terrain was quite steep and challenging. When I first checked the terrain, I thought it might be too difficult for the robot, but it could just handle all of them. The open stairs were also challenging due to the difficulty of mapping. Because the lidar scan passes through the steps, the robot couldn’t see the stairs properly. But the robot was robust enough to traverse them.

At what point does the robot fall back to proprioceptive locomotion? How does it know if the data its sensors are getting are false or misleading? And how much does proprioceptive locomotion impact performance or capabilities?

Miki: We think the robot detects whether the exteroception matches the proprioception through its foot contacts or foot positions. If the map is correct, the feet make contact where the map suggests. The controller then recognizes that the exteroception is correct and makes use of it. Once it experiences foot contacts that don’t match the ground on the map, or feet that go below the map, it recognizes that the exteroception is unreliable and relies more on proprioception. We showed this in this supplementary video experiment:


Video: "Supplementary Robustness Evaluation" (YouTube)

However, since we trained the neural network in an end-to-end manner, where the student policy just tries to follow the teacher’s action by capturing the necessary information in its belief state, we can only guess how it knows. In our initial approach, we directly input the exteroception into the control policy. In this setup, the robot could walk over obstacles and stairs in the lab environment, but once we went outside, it failed due to mapping failures. Therefore, combining exteroception with proprioception was critical to achieving robustness.
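For readers who want a more concrete picture of what "the student policy just tries to follow the teacher's action" means, here is a rough, hypothetical sketch of privileged teacher-student imitation with a recurrent belief state, written from the description above rather than from the paper's code. The input dimensions, network sizes, and loss are assumptions for illustration.

```python
# Hypothetical sketch of privileged teacher-student imitation; all shapes and
# sizes below are assumptions, not the paper's actual configuration.
import torch
import torch.nn as nn

PROPRIO_DIM, EXTERO_DIM, PRIV_DIM, ACT_DIM, BELIEF_DIM = 48, 208, 64, 12, 128


class TeacherPolicy(nn.Module):
    """Sees privileged, noise-free terrain information (only available in simulation)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(PROPRIO_DIM + PRIV_DIM, 256), nn.ELU(), nn.Linear(256, ACT_DIM))

    def forward(self, proprio, privileged):
        return self.net(torch.cat([proprio, privileged], dim=-1))


class StudentPolicy(nn.Module):
    """Sees only proprioception plus possibly unreliable exteroception, and must
    compress whatever it needs into a recurrent belief state."""
    def __init__(self):
        super().__init__()
        self.belief = nn.GRU(PROPRIO_DIM + EXTERO_DIM, BELIEF_DIM, batch_first=True)
        self.head = nn.Linear(BELIEF_DIM, ACT_DIM)

    def forward(self, proprio_seq, extero_seq):
        h, _ = self.belief(torch.cat([proprio_seq, extero_seq], dim=-1))
        return self.head(h)


teacher, student = TeacherPolicy(), StudentPolicy()
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)

# One imitation step on a batch of simulated rollouts (8 rollouts, 100 timesteps).
proprio = torch.randn(8, 100, PROPRIO_DIM)      # joint states, contacts, IMU, ...
privileged = torch.randn(8, 100, PRIV_DIM)      # clean terrain info (sim only)
extero = torch.randn(8, 100, EXTERO_DIM)        # corrupted height samples around the feet

with torch.no_grad():
    target_actions = teacher(proprio, privileged)   # what the teacher would do
loss = nn.functional.mse_loss(student(proprio, extero), target_actions)

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The essential design choice is that only the teacher sees noise-free terrain information, which exists only in simulation, while the student must decide through its belief state how much of the corrupted exteroception to trust; that is where the proprioception fallback comes from.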

How much are you constrained by the physical performance of the robot itself? If the robot were stronger or faster, would you be able to take advantage of that?

Miki: When we use reinforcement learning, the policy usually tries to use as much torque and speed as it is allowed to use. Therefore, if the robot were stronger or faster, we think we could increase robustness further and overcome more challenging obstacles at higher speeds.

What remains challenging, and what are you working on next?

Miki: So far, we have steered the robot manually for most of the experiments (except the DARPA SubT Challenge). Adding more levels of autonomy is the next goal. As mentioned above, we want the robot to complete a difficult hike without human operators. Furthermore, there is a lot of room for improvement in the robot’s locomotion capability. For “harsher” terrains, we want the robot to perceive the world in 3D and exhibit richer behaviors, such as jumping over stepping stones or crawling under overhanging obstacles, which is not possible with the current 2.5D elevation map.

Article: spectrum.ieee.org
