As seen on Dark Reading.
Today, many organizations are blind to the threats lurking within their API traffic. Even worse, they fall back on basic logging of that traffic, logging that omits the meaningful information about who accessed the data, which records were accessed or manipulated, and how. There is a justified fear of logging sensitive data or falling out of compliance, and without technology that can handle it at scale, organizations prefer to log with low fidelity. Those logs tell you that "somebody modified or accessed a record," but they typically don't disclose who accessed it, which record, or what action was performed.
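To make the contrast concrete, here is a minimal sketch of what a low-fidelity log entry looks like next to a high-fidelity one. The field names and token formats are invented for illustration; they are not Neosec's actual log schema.

```python
# Hypothetical illustration of log fidelity. Field names are invented.

low_fidelity = {
    "timestamp": "2023-04-01T12:00:00Z",
    "endpoint": "/records",
    "status": 200,  # "somebody accessed a record" -- and little else
}

high_fidelity = {
    "timestamp": "2023-04-01T12:00:00Z",
    "endpoint": "/records/{id}",
    "method": "PUT",            # what action was performed
    "actor": "tok_user_7f3a",   # who -- tokenized, not raw PII
    "record_id": "tok_rec_91bc",# which record -- also tokenized
    "status": 200,
}

# The high-fidelity entry answers "who changed which record, and how"
# without storing the underlying sensitive identifiers.
```

The point is that the high-fidelity record supports detection and forensics, while the identifiers it carries are opaque tokens rather than raw personal data.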
This decision also results in a downstream issue of insufficient logging, which the Open Web Application Security Project (OWASP) flags as one of the top security problems in its API Security Top 10 ("Insufficient Logging & Monitoring"). Insufficient logging is poor for incident forensics and, in practice, means that you can't detect abuse or investigate a case, even if you know it happened.
Tokenization is the process of substituting a sensitive data element, such as a credit card number, with a non-sensitive equivalent that has no intrinsic or exploitable value or meaning. Neosec's automated tokenization is part of its 'privacy by design' philosophy and is already deployed successfully at customers around the world, including financial services, insurance, and hospitality companies.
The process allows tokenized API activity data to be retained for true behavioral analytics over time, ensures that sensitive data is never stored at rest, and enables only the customer to de-tokenize, in line with the strictest data privacy practices.
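A minimal Python sketch of the idea: sensitive values are replaced by random tokens, the same value always maps to the same token (so behavioral analytics can still correlate activity over time), and the token-to-value mapping lives in a "vault" that only its holder can use to de-tokenize. This is an illustration of the general technique, not Neosec's implementation.

```python
import secrets


class Tokenizer:
    """Sketch of vault-based tokenization (illustrative, not Neosec's code)."""

    def __init__(self):
        self._vault = {}  # token -> original value; held only by the customer
        self._seen = {}   # original value -> token, for consistent tokens

    def tokenize(self, value: str) -> str:
        # Reuse the same token for the same value so analytics can still
        # track "the same user touched the same record" across events.
        if value not in self._seen:
            token = "tok_" + secrets.token_hex(8)
            self._seen[value] = token
            self._vault[token] = value
        return self._seen[value]

    def detokenize(self, token: str) -> str:
        # Only possible with access to the vault.
        return self._vault[token]


t = Tokenizer()
a = t.tokenize("4111-1111-1111-1111")  # credit card number -> opaque token
b = t.tokenize("4111-1111-1111-1111")  # same input, same token
```

Because the analytics pipeline only ever sees tokens like `tok_...`, the stored activity data contains no sensitive values at rest, yet whoever holds the vault can still reverse a token during an investigation.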
"Solving API security starts with basic visibility and the ability to see how the APIs are used. The problem is that virtually every company logs API activity with low fidelity that doesn't enable this basic visibility," said Giora Engel, co-founder and chief executive officer, Neosec. "To perform true behavioral analytics and investigate cases, you must store and examine historical data. But if this analysis is performed on un-tokenized data, you risk storing PII and creating compliance issues. Neosec successfully retains all API activity data, at the highest fidelity, and ensures it meets data privacy standards."
This focus on data and the visibility it brings is what previously defined the creation of the EDR (Endpoint Detection & Response) security space. "Trying to implement API security without enabling basic visibility of activity is like going back to the antivirus age before the advent of EDR. Visibility into API activity allows you to detect threats, understand behavior, investigate, and remediate," said Engel.
The Neosec API security solution discovers and maintains an up-to-date inventory of all APIs in use by an organization and then uses machine learning and behavioral analytics on tokenized data to find fraud and abuse by third parties and attackers. Neosec also enables proactive API threat hunting and investigations without storing any sensitive data.
Automated API data tokenization is now a fully available capability of the Neosec platform, at no extra cost.