Iván Tajes knew from a very young age that he wanted to be an engineer.
“I didn’t know what kind of engineer, though,” Tajes says. “Before university, I was deciding between architecture and computer engineering. In the end, I chose computers, but a part of me still misses the creative side of designing beautiful buildings. I think that is why I work on the front end: because I want to create beautiful things that people use.”
Tajes has been creating beautiful and highly useful things at Empathy.co since early 2017, and his natural talent led him to become leader of the front-end development team last summer at Empathy’s headquarters in Gijón, on the northern Spanish coast of Asturias.
Tajes manages day-to-day Empathy Interface implementations for multiple customers, striving to deliver the most frictionless search and discovery tools for each client. He may work in Gijón, but he retains the accent of his native Galicia in northwest Spain. We spoke with Tajes about the front-end team’s implementation of Empathy Interface, a tool (formerly called EmpathyX) that is the outcome of years of expertise in building unique search relationships for e-commerce.
What makes Empathy’s search interface different? How has it evolved and adapted to time-to-market requirements?
Well, the special thing about Empathy Interface is its adaptability. Everything is configurable and interchangeable: not only the appearance but also the behaviour. So we can adapt the project to any customer requirement in a relatively short time.
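Tajes does not describe the implementation here, but configuration-driven interchangeability of this kind is often built on a component registry, where a customer setup swaps in its own pieces without touching the base product. A minimal TypeScript sketch under that assumption (the names `configure`, `render` and the `result-card` component are illustrative, not Empathy Interface APIs):

```typescript
// A factory produces the markup (or component) for a named slot.
type ComponentFactory = (query: string) => string;

// Registry of named components; the base product pre-registers defaults.
const registry = new Map<string, ComponentFactory>();

// Default result card shipped with the base project.
registry.set("result-card", (q) => `<article class="result">${q}</article>`);

// A customer setup can override any entry at configuration time.
function configure(name: string, factory: ComponentFactory): void {
  registry.set(name, factory);
}

// Rendering looks up whichever factory is currently registered.
function render(name: string, query: string): string {
  const factory = registry.get(name);
  if (!factory) throw new Error(`No component registered for "${name}"`);
  return factory(query);
}
```

The design choice is that customisation happens by registration rather than by editing base code, which is what keeps custom work from "breaking anything" in the standard setup.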
Empathy Interface is a plug-in tool that builds customer relationships which adapt to the user’s circumstances. More broadly, Empathy.co perceives Search as a joining of mind and body: the mindful intentions of the searcher are supported and linked by the searcher’s physical feelings and sensations. Complex functionalities support this need to both inform and evoke feeling. How do these functionalities in the front end accomplish this?
The Empathy Interface is the point of union of all our technologies, knowledge and functionalities. As you said, it’s the Body that connects all the parts with the Mind. All our APIs, such as Search, Data, Contextualize, Empathize … all are connected and shown to the world through the Empathy Interface. And like Body and Mind, one cannot live without the other. In our team, we go above and beyond to honour all those working behind the scenes, in the mind.
How do you implement the Interface for a customer? What are typical timeframes? What are the first deliverables?
The “setup” process, as we call it, takes a couple of weeks by default. In the first week, we have a first version ready for testing in the customer’s development environment. In the second week, we receive customer feedback, polish the details and get everything ready to go live.
This is the standard process when all of the customer’s requirements are already covered by the “base” project as common features. If the customer requires something custom, something special, we can develop it without a problem, without breaking anything, though it will lengthen the process a little.
But in most cases, the base product is configurable enough to meet the customer’s requirements.
After the Interface is deployed, search journeys grow longer: users spend at least 200 percent more time interacting with the search box and its navigation, yet site search conversion rates rise by 25 percent. How do you make that additional time joyful?
For the Empathy Interface team, our mission is to create products that provoke positive emotions in users. When the search relationship is satisfying, when the user is having fun, time goes faster. We try to make interaction with the interface fast and effortless – for instance, there is no need to reload the whole web page with every query typed or filter selected. Quick response is very important. We also use animations and transitions to communicate changes to the user, and we do it fluidly, with no abrupt jumps. All these elements make the experience more pleasant.
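One common technique behind this kind of in-place, no-reload responsiveness is debouncing keystrokes, so a results request fires only once the user pauses typing rather than on every character. A minimal sketch, assuming a generic debounce helper (this is a standard pattern, not a documented Empathy Interface API):

```typescript
// Wraps a function so that rapid repeated calls collapse into one call,
// made only after `delayMs` of inactivity.
function debounce<A extends unknown[]>(
  fn: (...args: A) => void,
  delayMs: number
): (...args: A) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: A) => {
    if (timer !== undefined) clearTimeout(timer); // cancel pending call
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Hypothetical usage: refresh results in place as the user types,
// without issuing one request per keystroke.
const refreshResults = debounce((query: string) => {
  console.log(`fetching results for "${query}"`);
}, 250);
```

The user still sees results appear almost immediately, but the interface avoids flooding the backend with a request per keystroke.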
What elements of empathy do you create within the code itself? Do the algorithms mimic human feeling and human connection in some form?
Inside our architecture, we have something called “wiring.” This wiring comprises the connections between all the parts and is, in the end, what defines the whole behaviour. You could say it emulates the human nervous system, with all the ramifications of those connections.
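The interview does not detail how wiring is implemented, but the description matches an event-bus pattern: handlers are wired to named events, and the set of wires is what defines the overall behaviour. A hypothetical TypeScript sketch under that reading (the `Wiring` class, `wire`, `emit` and the event name are illustrative, not Empathy Interface code):

```typescript
type Handler<T> = (payload: T) => void;

// An event bus: each named event carries a list of wired handlers,
// and emitting an event runs every handler connected to it.
class Wiring {
  private wires = new Map<string, Handler<any>[]>();

  // Connect a handler to an event; many handlers may share one event.
  wire<T>(event: string, handler: Handler<T>): void {
    const handlers = this.wires.get(event) ?? [];
    handlers.push(handler);
    this.wires.set(event, handlers);
  }

  // Fire an event, fanning the payload out to all wired handlers.
  emit<T>(event: string, payload: T): void {
    for (const handler of this.wires.get(event) ?? []) {
      handler(payload);
    }
  }
}
```

Changing behaviour then means rewiring connections rather than rewriting components, which fits the nervous-system analogy: the same parts, connected differently, behave differently.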
How does Empathy Interface bring the body of search to the definition of what search is? How does search acquire qualities of level and strength?
Search is the way to fill the gap between what a person wants and what is actually offered. And there are many ways to fill this gap. One is to offer relevant results to the user, of course, but another is to guide the user through different searches to find what is relevant, or to show journeys the user didn’t know he or she wanted to follow. We use both approaches in Empathy Interface, trying to allow for all the possibilities.
Conscious, explicit text and picture elements such as past purchases and trendy merchandise help to make a search interface sticky. What are some subconscious, implicit, and subjective visual elements of the interface that cue the shopper to feel a connection or relationship?
Of course, we have a history component to show the user’s recent queries. But we go a step further. Using the user’s recent activity (results clicked, query history, filters applied), we show a “wall” of results that may interest them. We call this the “Discovery Wall,” and it appears just as the user opens the search. By trying to guess which results will suit, we want to evoke in the user a feeling of familiarity and personalisation.
We also offer query suggestions to the user, even on a first visit to the site before any query has been typed. And once the user searches, we offer possible Next Queries based on the current query, so the user can keep searching without even typing in the search box. This makes the search journey easier and more effortless.
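The article does not say how Next Queries are computed. One simple way to approximate the idea is to rank candidate queries by how often they follow the current query in past search sessions. A hypothetical sketch under that assumption (the `nextQueries` function and the session data are illustrative, not Empathy’s actual algorithm):

```typescript
// Given past sessions (each an ordered list of queries) and the current
// query, suggest the queries that most often came next, ranked by count.
function nextQueries(
  sessions: string[][],
  current: string,
  limit = 3
): string[] {
  const counts = new Map<string, number>();
  for (const session of sessions) {
    for (let i = 0; i < session.length - 1; i++) {
      if (session[i] === current) {
        const next = session[i + 1];
        counts.set(next, (counts.get(next) ?? 0) + 1);
      }
    }
  }
  // Sort candidates by descending frequency and keep the top `limit`.
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, limit)
    .map(([query]) => query);
}
```

A real system would weigh far more signals (clicks, conversions, recency), but even this co-occurrence sketch shows how a user can continue a journey by tapping a suggestion instead of typing.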
What is your vision for the Interface? What is its mission for the future, five years from now?
The future of Empathy Interface has two horizons, or gravitational forces. On one side, we envision an architectural future where Interface evolves from the notion of a product to that of an ecosystem, one that will allow everyone to create and integrate a search interface easily, without deep technical knowledge. This goal requires open-source considerations so that the community can contribute.
On the other side, there is a functional horizon, and here we envision with clarity a fundamental shift: from mere Search-Data personalisation (or contextualisation, as we prefer to call it) to Search-Interface personalisation. Think of it as an Interface with the capacity to express its form (structure and state) morphologically, adapting to the moment and unlocking a whole world of infinite possibilities through the negotiation of perceptions, all while evoking the most powerful feeling: understanding through visual expression, a fundamental pillar of what it means to be human.