NeXN Assistant

Developing a modular, voice-activated, artificially intelligent assistant that handles everyday tasks, including taking pictures
Startup name
NeXN Assistant
Founding date
June 2016
Total team members
Startup stage
Consumer, Hardware, Software
Consumer electronics, Industrial internet & IoT
Product usage or interaction
Mobile/Tablet (iOS), Mobile/Tablet (Android), Hardware-non-wearable
Location
Lagos, Lagos, Nigeria

Elevator pitch

The problem with speech systems is how limited they can be; however, with smart homes coming of age, things are changing. This product takes usefulness to the next level by being modular and adaptable, so it can be carried and used in more environments (cars, home, shopping) where it is easier to speak commands than to use a screen interface. It also includes body tracking for picture taking.


Olabosipo Shoroye

3+ years' experience developing mobile applications with some success. 5+ years as a full-stack developer. MSc in Advanced Engineering and Management. Very tenacious, now developing in the speech recognition software space; worked as a marketing intern in the UK in 2014.

Business model

Target customer

Smartphone users

Customer acquisition strategy

Online marketing channels & IndieGoGo

Revenue model

Similar to the iPhone model, customers will pay for the hardware only and the company will cover software costs; future upgrades will tempt customers to buy again.

Market info

Market size

$431,000,000,000 ($431 billion)


Amazon Echo has roughly a 70% market share, Google holds most of the rest, and Apple is expected to enter the market soon. These competitors focus only on home/office hardware and are not modular enough to be carried and used on the go. None offers body tracking for picture taking. Both incumbents operate in only a few countries (US, UK, Germany), leaving the vast majority of other countries up for grabs.

Startup traction



The product adapts to its environment, e.g. it can give directions via LEDs while the user is driving, or play songs from an MP3 module while they shop. Highly talented speech experts will be added to the team before launch.

Publication date: 02 August 2017