SapFix post 1

Facebook: ‘SapFix’ AI debugs your code automatically

Without any prior announcement, Facebook has built and launched a new artificial-intelligence debugging tool called ‘SapFix’. The tool is meant to reduce the time software engineers spend on debugging, which in turn speeds up the rollout of new updates and software.

Facebook said that SapFix is capable of generating fixes for specific bugs and reporting them to the engineers for approval or modification.

SapFix is already being actively used to speed up the shipping of stable code updates to the millions of Android devices running the Facebook app. Facebook has also said that it intends to share SapFix with the engineering community and to see automated debugging evolve to the next level.

Facebook also stated that SapFix operates as a standalone tool and can run with or without Sapienz, Facebook’s automated software-testing tool.

SapFix is focused on fixing bugs identified by Sapienz before they ever reach production. When Sapienz, together with Facebook’s Infer static-analysis tool, localizes a point in the code that requires a patch or fix, it is passed to SapFix, which automatically picks a strategy or strategies to generate one.

For more complex fixes, the tool harvests templates from patches that human engineers created for past bugs. Even once SapFix has generated a fix, its work is not over: it generates several other potential fixes and evaluates each against three criteria – whether it compiles, whether the original crash persists, and whether the fix introduces new crashes.

To check for any new problems, SapFix runs existing developer-written tests as well as Sapienz tests on the patched build.

Once SapFix has fully tested a patch, it sends it to a human engineer for review and approval.
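The validation loop described above can be sketched in a few lines. This is purely an illustrative mock, not Facebook’s actual SapFix code; all function and variable names are invented. Each candidate patch is checked against the three criteria from the article (it compiles, the crash is gone, no tests break) before being queued for human review.

```python
# Hypothetical sketch of a SapFix-style patch-validation loop.
# All names are invented for illustration.

def validate_patch(patch, build, crash_reproduces, tests_pass):
    """Return True if a candidate patch passes all three checks."""
    if not build(patch):            # 1. the patched code must compile
        return False
    if crash_reproduces(patch):     # 2. the original crash must be gone
        return False
    if not tests_pass(patch):       # 3. existing tests must still pass
        return False
    return True

def pick_patches_for_review(candidates, build, crash_reproduces, tests_pass):
    """Filter candidate patches down to the ones worth a human's time."""
    return [p for p in candidates
            if validate_patch(p, build, crash_reproduces, tests_pass)]

# Toy usage: patches are strings; a patch "works" if it contains "fix".
candidates = ["fix-null-check", "noop", "fix-bounds"]
approved = pick_patches_for_review(
    candidates,
    build=lambda p: True,                      # everything compiles
    crash_reproduces=lambda p: "fix" not in p, # crash persists unless fixed
    tests_pass=lambda p: True,                 # no regressions
)
print(approved)  # ['fix-null-check', 'fix-bounds']
```

Only the patches that clear all three gates reach a reviewer, which mirrors how the article says SapFix narrows many generated candidates down to a few worth approving.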

Google AI post

Google’s Now Playing is now officially called Sound Search

Google introduced Now Playing, a low-power, always-on music-recognition feature, with the Pixel 2. While developing Now Playing, Google’s primary goal was to create a music recognizer that was both small and efficient. The feature used compact audio fingerprints for each track in its database to recognize the song playing nearby, all without needing an internet connection.

What Google later discovered is that Now Playing was not only an efficient on-device music recognizer but also more accurate than Sound Search, Google’s then-current server-side system. For those unaware, Sound Search was built before deep neural networks came into widespread use. Google reportedly had already considered bringing Now Playing’s technology to the server-side Sound Search to improve its music-recognition capabilities.

The recent version of Sound Search introduced by Google is said to be powered by some of the same technology behind Now Playing. The feature is easily accessible through the Google Search app or the Google Assistant on any Android smartphone. If music is playing nearby, starting a voice query will bring up a pop-up suggestion asking “What’s this song?”, which you can tap. Alternatively, you can ask, “Hey Google, what’s this song?” Either way, results are faster and more accurate than before.

Google also explained that Now Playing uses machine learning to create compact audio fingerprints that can run entirely on a phone. Having already built a good audio-fingerprinting system, Google found that those ideas carried over well to the server-side Sound Search system.
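The core idea of fingerprint matching can be illustrated with a toy example. Real systems like Now Playing derive fingerprints from neural-network embeddings of audio; here each “track” is just a small vector, and a query is matched to the closest track by cosine similarity. All data and names below are invented for illustration and have nothing to do with Google’s actual implementation.

```python
# Toy sketch of fingerprint-based matching: find the database track whose
# fingerprint vector is most similar to the query fingerprint.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_match(query, database):
    """Return the track name whose fingerprint is closest to the query."""
    return max(database, key=lambda name: cosine(query, database[name]))

database = {
    "track_a": [0.9, 0.1, 0.0],
    "track_b": [0.1, 0.8, 0.3],
}
query = [0.85, 0.15, 0.05]          # a noisy fingerprint of track_a
print(best_match(query, database))  # track_a
```

Because the fingerprints are compact vectors, a match tolerates noise in the query, which is what lets recognition work on short, imperfect snippets of ambient audio.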

Google says that there is still room for improvement in the Sound Search feature and they are striving to make it even better in the future.

Pinterest

Pinterest Boasts a Massive 250 Million Active Monthly Users

Last September, Pinterest reported crossing 200 million monthly active users, which was great news to hear. The same month this year, the image-based social bookmarking site reported reaching a milestone of 250 million monthly active users, which again is a great feat.

While sharing the statistics, Pinterest also reported that 80 percent of new sign-ups come from outside the USA, and 40 percent of new sign-ups are men.

Another thing to know is that Pinterest is becoming an increasingly shoppable platform. Earlier this year, Pinterest announced that 87 percent of its users said they had bought a product because it was relevant to the content they were engaging with.

Along with the user base, “Pins” are also growing. There are currently about 175 billion of them, an increase of about 75 percent.

Since the community has been growing at a rapid pace, the company has also begun hiring at a faster rate. It recently hired over 1,500 new employees, a 32 percent increase from the previous year.

That 87 percent of its users buy things suggests that Pinterest, the visual-search giant, has become critical to retailers’ online sales and marketing strategies. Not only has the platform seen a rise in its active user numbers, but consumers are also spending more money when purchasing through the platform.

The company has also been making efforts to partner with major retail companies. In March, Pinterest forged a partnership with The Home Depot to expand the social network’s discovery feature, Shop the Look. Similarly, Target has partnered with Pinterest for visual search.

App Development

Facebook Restores Deleted Cross-Posts After Deletion of Twitter App

Recently, Twitter requested the deletion of its app from Facebook, as the app had become useless once Facebook removed the feature that allowed people to cross-post tweets and updates from Twitter, Axios, and other apps.

What no one anticipated was that deleting the app from Facebook could also delete users’ old posts. When this accident occurred, Facebook immediately got in touch with Twitter to resolve the issue. It was only the next morning, however, that Facebook announced it was restoring the deleted content for users.

The cross-posting feature was genuinely useful. Many users relied on it to maintain their presence on Facebook and continue their discussions with a new audience. The deletion meant that users who used cross-posting to engage with their audience had, in many cases, lost years’ worth of conversations.

Once the issue was resolved, Facebook announced that the deleted content had been restored and that all affected users could once again view their posts and tweets.

Although the deletion was only temporary, it is a reminder that users are not always in control of the content they create on social media platforms.

Mobile App Development

Alexa and Google Home won’t take away your regional accent

A recent survey by the Life Science Centre in Newcastle found that about 79 percent of visitors who attended its “Robots – Then and Now” exhibition had to alter the way they spoke in order to be understood by voice assistants such as Alexa, Google Assistant, Siri, and Cortana.

These visitors reportedly had regional British accents and needed to shift their accents to communicate effectively with the voice assistants.

It is important to understand that people with non-standard accents have a harder time being understood when interacting with smart-assistant devices, in part because speech technology seems to favor standard accents.

Even as humans we sometimes struggle to recognize where people are from based on their accents alone, and virtual-assistant devices will likewise take time to gain that familiarity with regional accents.

People worry that these virtual assistants will take away their regional accents, but we have to be realistic here. Interactions between humans and robots are far less frequent than human-to-human interactions. Since our everyday speech is designed for communicating with other humans, small changes in the way we pronounce words for a device will not alter our accents drastically. The most we might find ourselves doing when interacting with virtual devices is raising or lowering our voices or enunciating more.

So we can stop fretting. Looking at how far virtual assistants have come in producing and understanding language, they have already made great progress, and they will only improve going forward, possibly becoming able to understand regional accents.

Microsoft’s OneDrive

Audio and Video Transcription Support Coming to Microsoft’s OneDrive

Microsoft announced on Tuesday that it is introducing new capabilities to OneDrive over the next few weeks and months. One of the most interesting is native audio and video transcription support, meaning the cloud storage service will automatically transcribe the audio and video files a user uploads.

However, the transcription capability is limited to OneDrive for Business. To power it, Microsoft is using the same AI that drives Microsoft Stream, its video-sharing service for corporate customers.

Whenever someone watches a video or listens to an audio file, a full transcript of the file will be shown in the OneDrive viewer, which supports more than 320 file types. The feature will also help users search for media based on the content within them.
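The search capability follows directly from having transcripts: once every media file has a text transcript, finding media “based on the content within them” reduces to searching text. The sketch below is a hypothetical illustration of that idea, not the OneDrive API; all names and data are invented.

```python
# Hypothetical sketch: transcripts make media files text-searchable.
def search_media(transcripts, phrase):
    """Return file names whose transcript contains the phrase (case-insensitive)."""
    phrase = phrase.lower()
    return [name for name, text in transcripts.items()
            if phrase in text.lower()]

transcripts = {
    "all_hands.mp4": "Welcome everyone, today we discuss the Q3 roadmap.",
    "interview.mp3": "Tell me about your experience with cloud storage.",
}
print(search_media(transcripts, "roadmap"))  # ['all_hands.mp4']
```

A real service would use a proper search index rather than a linear scan, but the principle is the same: the transcript, not the audio, is what gets searched.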

There is no official launch date for when the functionality will be available in OneDrive for Business, and it is not yet clear whether Microsoft will bring it to all OneDrive users.