Codenamed "Eagle," Snapchat is building a visual product search feature that delivers users to Amazon's listings. Buried inside the code of Snapchat's Android app is an unreleased "Visual Search" feature where you "Press and hold to identify an object, song, barcode, and more! This works by sending data to Amazon, Shazam, and other partners." Once an object or barcode has been scanned, you can "See all results at Amazon."

Visual product search could make Snapchat's camera a more general-purpose tool for seeing and navigating the world, rather than just a social media maker. It could differentiate Snapchat from Instagram, whose clone of Snapchat Stories now has more than twice the users and a six-times-faster growth rate than the original. And if Snapchat has worked out an affiliate referrals deal with Amazon, it could open a new revenue stream it direly needs after posting a $385 million loss last quarter and missing revenue estimates by $14 million.

TechCrunch was tipped off to the hidden Snapchat code by app researcher Ishan Agarwal. His tips have previously led to TechCrunch scoops about Instagram's video calling, soundtracks, Focus portrait mode and QR Nametags features that were all later officially launched. Snap gave TechCrunch a "no comment" about visual search, but the company's code tells the story.

Snap already sells its Spectacles v2 camera glasses on Amazon, the only place beyond its own site. Amazon didn't respond to a press inquiry before publishing time, and it's unclear if it's actively involved in the development of Snapchat visual search or just a destination for its results.

Snapchat first dabbled in understanding the world around you with its Shazam integration back in 2016, which lets you tap and hold to identify a song playing nearby, check it out on Shazam, send it to a friend or follow the artist on Snapchat. Project Eagle builds on this audio search feature to offer visual search through a similar interface and set of partnerships. The ability to identify purchasable objects or scan barcodes could turn Snapchat, which some view as a teen toy, into more of a utility.

Snapchat's code doesn't explain exactly how the Project Eagle feature will work, but in the newest version of Snapchat it was renamed "Camera Search." The code lists the ability to surface "sellers" and "reviews," "Copy URL" of a product and "Share" or "Send Product" to friends, likely via Snap messages or Snapchat Stories. In characteristic cool-kid teenspeak, an error message for "product not found" reads "Bummer, we didn't catch that!"

Eagle's visual search may be connected to Snapchat's "context cards," which debuted late last year and pull up business contact info, restaurant reservations, movie tickets, Ubers or Lyfts and more. Surfacing within Snapchat a context card of details about ownable objects might be the first step to getting users to buy them… and advertisers to pay Snap to promote them.

Snapchat also released a machine vision-powered search feature last year that compiles Stories of user-submitted Snaps featuring your chosen keyword, like videos with "puppies" or "fireworks," even if the captions don't mention them. In mid-2017 Snapchat launched World Lenses that map the surfaces of your surroundings so you can place 3D animated objects like its Dancing Hotdog mascot alongside real people in real places. And Snap already has in-app shopping.

Being able to recognize what you're seeing makes Snapchat more fun, but it's also a new way of navigating reality. It's easy to imagine context cards being accessible for products tagged in Snap Ads as well as scanned through visual search.