Safelyx JavaScript SDK is now available in NPM and JSR

The Safelyx JavaScript SDK is now published on NPM and JSR, so you can install it in your project with either package manager.

We're excited to announce the release of our JavaScript SDK for automated content moderation! The @safelyx/api package is now available on both NPM and JSR (and deno.land/x), making it easier than ever to integrate Safelyx's powerful platform moderation tools into your JavaScript and TypeScript projects. You can also find its source code on GitHub (it's Open Source!).

Simplicity Across All JavaScript Environments

At Safelyx, we believe in making content moderation simple and accessible. We understand that "simple" means different things to different developers and platforms, which is why we've created a JavaScript SDK that works seamlessly across all modern JavaScript environments:

  • Node.js
  • Deno
  • Bun
  • Web Workers
  • Service Workers
  • Cloudflare Workers
  • Browser
  • React Native
  • And anything with a standard fetch API (see the Worker sketch below)!
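
Because the SDK only relies on a standard fetch, the same few lines run unchanged in serverless runtimes. Below is a minimal sketch of a Cloudflare Worker that checks a link passed in the query string; it assumes the package can be imported as an ES module named '@safelyx/api' in your Worker build, so confirm the exact import for your setup in the documentation.

import safelyx from '@safelyx/api'; // assumes an ESM import works in your Worker bundler

export default {
  async fetch(request) {
    // Read the link to check from the query string, e.g. /?link=https://example.com
    const link = new URL(request.url).searchParams.get('link');
    if (!link) {
      return new Response('Missing "link" query parameter', { status: 400 });
    }

    const checkResult = await safelyx.checkLink(link);

    // checkResult.result is a 0-10 safety score (higher is safer)
    return Response.json({ link, score: checkResult.result });
  },
};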

Quick Start with NPM

For Node.js projects, getting started with our automated content moderation SDK is as simple as running:

npm install --save-exact @safelyx/api

Then in your code:

const safelyx = require('@safelyx/api');

async function checkContentSafety() {
  // checkLink returns a result object with a 0-10 safety score (higher is safer)
  const checkResult = await safelyx.checkLink('https://example.com');

  if (checkResult.result >= 8) {
    console.log('Link is safe!');
  } else if (checkResult.result >= 4) {
    console.log('Link needs review');
  } else if (checkResult.result >= 0) {
    console.log('Link is unsafe!');
  }
}

checkContentSafety();

Using with Deno/Bun and JSR

For Deno or Bun projects, you can import our SDK directly from JSR or deno.land/x:

import safelyx from 'jsr:@safelyx/api@0.1.0';

const checkResult = await safelyx.checkLink('https://example.com');
if (checkResult.result >= 8) {
  console.log('Link is safe!');
} else if (checkResult.result >= 4) {
  console.log('Link needs review');
} else if (checkResult.result >= 0) {
  console.log('Link is unsafe!');
}

Comprehensive Platform Moderation Features

Our SDK provides access to all of Safelyx's content moderation endpoints, sketched together after this list:

  • Link Safety Verification: Protect users from malicious URLs and phishing attempts
  • Message Content Analysis: Detect inappropriate content, sentiment, and potential threats
  • Email Legitimacy Validation: Verify email addresses and prevent abuse
  • Image Safety Analysis: Ensure uploaded images meet your platform's standards
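
For illustration, here's a rough sketch of what calling each endpoint could look like. checkLink comes straight from the quick-start examples above; checkMessage, checkEmail, and checkImage are assumed names inferred from the endpoint list, so confirm the exact method names and parameters in the API documentation.

import safelyx from 'jsr:@safelyx/api@0.1.0';

// Link safety verification (as in the quick-start examples above)
const linkCheck = await safelyx.checkLink('https://example.com');

// The method names below are assumptions based on the endpoint list above.
const messageCheck = await safelyx.checkMessage('Hey, claim your free prize here!');
const emailCheck = await safelyx.checkEmail('someone@example.com');
const imageCheck = await safelyx.checkImage('https://example.com/uploads/avatar.png');

// Each check returns a result object with a 0-10 safety score (field name assumed to match checkLink)
console.log(linkCheck.result, messageCheck.result, emailCheck.result, imageCheck.result);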

Real-world Implementation Examples

Our documentation and use cases show how the SDK helps with a range of content moderation scenarios:

  • Platform content moderation
  • User-generated content protection
  • Community safety enforcement
  • Real-time threat detection

Simple Integration, Powerful Results

The SDK returns detailed analysis results while maintaining a simple API interface. Each check returns a comprehensive result object that includes the following (a small scoring helper follows the list):

  • Safety score (0-10)
  • Detailed analysis
  • Content sentiment
  • Threat assessment
  • Recommended actions
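
Since the safety score is the field you'll branch on most often, here's a small helper, a minimal sketch that relies only on the documented 0-10 score, mapping it into the same three buckets used in the quick-start examples:

import safelyx from 'jsr:@safelyx/api@0.1.0';

// Map the 0-10 safety score into the three buckets used in the examples above.
function classifySafetyScore(score) {
  if (score >= 8) {
    return 'safe';
  }
  if (score >= 4) {
    return 'needs review';
  }
  return 'unsafe';
}

const checkResult = await safelyx.checkLink('https://example.com');
console.log(classifySafetyScore(checkResult.result)); // e.g. 'safe'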

Getting Started

  1. Install the package from NPM or import from JSR
  2. Buy a key code
  3. Start protecting your platform with just a few lines of code (see the sketch below)
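
As a rough sketch of step 2, the example below assumes your key code is passed as an optional second argument to a check; double-check the exact parameter name and position in the API documentation before relying on it.

import safelyx from 'jsr:@safelyx/api@0.1.0';

// Assumption: the key code is passed as an optional second argument to each check.
const checkResult = await safelyx.checkLink('https://example.com', 'your-key-code');

console.log(checkResult.result); // 0-10 safety score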

Our SDK is designed to grow with your platform, offering:

  • Detailed safety analysis
  • Real-time threat detection
  • Comprehensive reporting
  • Flexible integration options
  • Cross-platform compatibility

Try It Today

Ready to enhance your platform's content moderation? Get started with our SDK today:

  • NPM: npm install --save-exact @safelyx/api
  • JSR: import safelyx from 'jsr:@safelyx/api@0.1.0'
  • Documentation: Safe API Documentation

Join the growing number of platforms using Safelyx for automated content moderation. Our SDK makes it easier than ever to protect your users while maintaining a positive user experience.

Questions or need help? Contact our support team at help@safelyx.com.