Tags: tutorial, node.js, twitter api, javascript

How to Get Twitter Data with Node.js (No Official API Needed)

Step-by-step tutorial for fetching tweets, user profiles, and followers using Node.js. Working code examples with a third-party Twitter API — no developer account required.

TweetAPI Team
February 17, 2026
10 min read

Every Twitter API tutorial out there is written in Python. Go search "twitter api tutorial" and count how many use requests and pandas. It's basically all Python.

So if you're a Node.js developer, good luck finding something you can actually copy-paste.

This guide is the Node.js version of that tutorial nobody wrote. We'll fetch user profiles, pull tweets, search content, handle pagination, and build a real analytics script. All using native fetch, no external HTTP libraries. Instead of the official X API (which requires a developer account and costs $200/month minimum), we're using TweetAPI, a third-party alternative with instant access.

Disclaimer: TweetAPI isn't affiliated with X Corp or the official X platform.

What You Need

  • Node.js 18+ (we're using the built-in fetch, so zero HTTP dependencies)
  • A TweetAPI account (sign up here, free, no credit card, 100 trial requests)
  • Basic JS knowledge (async/await, JSON, template literals)

Authentication is just an API key in a header. No OAuth, no bearer tokens.

Project Setup

Get Your API Key

Sign up at tweetapi.com, go to the dashboard, create a key. Copy it and keep it safe.

Scaffold the Project

mkdir twitter-nodejs && cd twitter-nodejs
npm init -y

Create a .env file:

TWEETAPI_KEY=your_api_key_here

And your main file. We're using .mjs so top-level await works without any extra config:

// twitter.mjs
const API_KEY = process.env.TWEETAPI_KEY;
const BASE_URL = "https://api.tweetapi.com/tw-v2";

if (!API_KEY) {
  console.error("Missing TWEETAPI_KEY environment variable");
  process.exit(1);
}

Run it with:

TWEETAPI_KEY=your_key node twitter.mjs

Or if you're on Node 20.6+:

node --env-file=.env twitter.mjs

Fetching a User Profile

Simplest possible call. Look up a user by username:

async function getUser(username) {
  const url = `${BASE_URL}/user/by-username?username=${encodeURIComponent(username)}`;
  const res = await fetch(url, {
    headers: { "X-API-Key": API_KEY },
  });

  if (!res.ok) {
    throw new Error(`API error: ${res.status} ${res.statusText}`);
  }

  return res.json();
}

// Usage
const user = await getUser("elonmusk");
console.log(user.data.name);         // "Elon Musk"
console.log(user.data.followerCount); // 226995885
console.log(user.data.bio);          // Profile bio text

You get back everything on a profile: name, bio, avatar URL, banner, follower/following counts, tweet count, verification status, creation date. One request.

Bulk Lookup

If you need several profiles, don't loop. Use by-usernames to grab up to 200 in a single call:

async function getUsers(usernames) {
  const url = `${BASE_URL}/user/by-usernames?usernames=${usernames.join(",")}`;
  const res = await fetch(url, {
    headers: { "X-API-Key": API_KEY },
  });

  if (!res.ok) {
    throw new Error(`API error: ${res.status}`);
  }

  return res.json();
}

const users = await getUsers(["nodejs", "vercel", "nextjs"]);
for (const user of users.data) {
  console.log(`${user.username}: ${user.followerCount} followers`);
}

One API call instead of three. Makes a real difference when you're monitoring 50+ accounts.
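If your watchlist is bigger than 200 accounts, batch it first. A minimal sketch — the `chunk` helper is illustrative, and it assumes the `getUsers()` function from above:

```javascript
// Split a long username list into batches of up to 200 (the per-call limit),
// so each batch fits into a single by-usernames request.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Usage with the getUsers() helper above:
// for (const batch of chunk(allUsernames, 200)) {
//   const { data } = await getUsers(batch);
//   // ...process each batch of profiles
// }
```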

Fetching a User's Tweets

The profile response gives you a user.data.id. That's what you pass to the tweets endpoint. Important: usernames won't work here; it has to be the numeric ID.

async function getUserTweets(userId, cursor) {
  const params = new URLSearchParams({ userId });
  if (cursor) params.set("cursor", cursor);

  const url = `${BASE_URL}/user/tweets?${params}`;
  const res = await fetch(url, {
    headers: { "X-API-Key": API_KEY },
  });

  if (!res.ok) {
    throw new Error(`API error: ${res.status}`);
  }

  return res.json();
}

const user = await getUser("elonmusk");
const tweets = await getUserTweets(user.data.id);

for (const tweet of tweets.data) {
  console.log(`[${tweet.type}] ${tweet.text.slice(0, 80)}...`);
  console.log(`  Likes: ${tweet.likeCount} | Retweets: ${tweet.retweetCount}`);
}

Each tweet comes with the full text, engagement numbers (likes, retweets, replies, views), any media attachments, a timestamp, and a type field: "tweet", "retweet", "quote", or "reply". That type field is handy for filtering out retweets when you only care about original content.
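Filtering on that type field is a one-liner. A quick sketch with inline sample data (the sample tweets are made up; only the type values come from the API):

```javascript
// Keep only original tweets, dropping retweets, quotes, and replies,
// using the type field described above.
function onlyOriginals(tweets) {
  return tweets.filter((t) => t.type === "tweet");
}

const sample = [
  { type: "tweet", text: "Shipping v2 today" },
  { type: "retweet", text: "RT someone else" },
  { type: "reply", text: "@user thanks!" },
];
console.log(onlyOriginals(sample).length); // 1
```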

Searching Tweets

The search endpoint works like Twitter's search bar. Same operators, same syntax:

async function searchTweets(query, type = "Latest", cursor) {
  const params = new URLSearchParams({ query, type });
  if (cursor) params.set("cursor", cursor);

  const url = `${BASE_URL}/search?${params}`;
  const res = await fetch(url, {
    headers: { "X-API-Key": API_KEY },
  });

  if (!res.ok) {
    throw new Error(`API error: ${res.status}`);
  }

  return res.json();
}

// Search for recent tweets about Node.js
const results = await searchTweets("node.js API", "Latest");
for (const tweet of results.data) {
  console.log(`@${tweet.author.username}: ${tweet.text.slice(0, 100)}`);
}

The type parameter controls what comes back:

  • "Latest" — chronological, most recent first
  • "Top" — sorted by engagement/relevance
  • "People" — user profiles instead of tweets
  • "Photos" — only tweets with images
  • "Videos" — only tweets with video

Note the capitalization: it's "Latest", not "latest". The API is strict about this.

All the standard Twitter search operators work: from:elonmusk, #javascript, "exact phrase", min_faves:100, and so on.
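For programmatic searches, those operators can be composed into a single query string. The operators are the real syntax; the buildQuery helper and its option names are just for illustration:

```javascript
// Compose a search query from the standard operators listed above.
// Parts are joined with spaces, exactly as in Twitter's search bar.
function buildQuery({ from, hashtag, phrase, minFaves } = {}) {
  const parts = [];
  if (phrase) parts.push(`"${phrase}"`);
  if (from) parts.push(`from:${from}`);
  if (hashtag) parts.push(`#${hashtag}`);
  if (minFaves) parts.push(`min_faves:${minFaves}`);
  return parts.join(" ");
}

console.log(buildQuery({ from: "elonmusk", minFaves: 100 }));
// from:elonmusk min_faves:100
```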

Handling Pagination

Every endpoint that returns a list is paginated. The API sends back a pagination object with a nextCursor. Pass that as cursor in your next request to get the next page:

async function getAllTweets(userId, maxPages = 5) {
  const allTweets = [];
  let cursor = undefined;

  for (let page = 0; page < maxPages; page++) {
    const response = await getUserTweets(userId, cursor);
    allTweets.push(...response.data);

    cursor = response.pagination?.nextCursor;
    if (!cursor) break; // No more pages

    console.log(`Fetched page ${page + 1} (${allTweets.length} tweets total)`);
  }

  return allTweets;
}

const user = await getUser("nodejs");
const tweets = await getAllTweets(user.data.id, 3);
console.log(`Got ${tweets.length} tweets from @nodejs`);

If you're paginating through a lot of pages, add a short delay between requests. Something like await new Promise(r => setTimeout(r, 500)) after each page keeps you within rate limits.
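That delay can live in a tiny helper so it reads cleanly inside the loop:

```javascript
// The suggested pacing delay as a reusable helper - resolves after ms milliseconds.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Inside getAllTweets(), after each page fetch:
//   await sleep(500);
```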

Getting Followers

Same idea as tweets. Pass a user ID, get back a paginated list. But instead of tweet objects, you get full user profiles for each follower:

async function getFollowers(userId, cursor) {
  const params = new URLSearchParams({ userId });
  if (cursor) params.set("cursor", cursor);

  const url = `${BASE_URL}/user/followers?${params}`;
  const res = await fetch(url, {
    headers: { "X-API-Key": API_KEY },
  });

  if (!res.ok) {
    throw new Error(`API error: ${res.status}`);
  }

  return res.json();
}

const user = await getUser("nodejs");
const followers = await getFollowers(user.data.id);

for (const follower of followers.data) {
  console.log(`@${follower.username}: ${follower.followerCount} followers`);
}

Since each follower comes back as a full user object (bio, follower count, everything), you can do audience analysis without extra requests. Want to know what percentage of someone's followers have 10K+ followers? You already have the data.
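That 10K+ question takes only a few lines. A sketch over follower objects shaped like the ones returned above (the sample data here is invented):

```javascript
// What share of these followers have at least `threshold` followers themselves?
// Works on follower objects already fetched - no extra API requests.
function shareAbove(followers, threshold = 10_000) {
  if (followers.length === 0) return 0;
  const big = followers.filter((f) => f.followerCount >= threshold).length;
  return Math.round((big / followers.length) * 100);
}

const sample = [
  { username: "a", followerCount: 25_000 },
  { username: "b", followerCount: 300 },
  { username: "c", followerCount: 12_000 },
  { username: "d", followerCount: 4_500 },
];
console.log(`${shareAbove(sample)}% have 10K+ followers`); // 50% have 10K+ followers
```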

Putting It All Together: Account Analytics

Here's a complete script that takes a username from the command line and prints an analytics breakdown:

// analytics.mjs
const API_KEY = process.env.TWEETAPI_KEY;
const BASE_URL = "https://api.tweetapi.com/tw-v2";

async function apiGet(path, params = {}) {
  const url = `${BASE_URL}${path}?${new URLSearchParams(params)}`;
  const res = await fetch(url, {
    headers: { "X-API-Key": API_KEY },
  });
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return res.json();
}

async function analyzeAccount(username) {
  // 1. Get profile
  const profile = await apiGet("/user/by-username", { username });
  const user = profile.data;

  console.log(`\n--- @${user.username} ---`);
  console.log(`Name: ${user.name}`);
  console.log(`Followers: ${user.followerCount.toLocaleString()}`);
  console.log(`Following: ${user.followingCount.toLocaleString()}`);
  console.log(`Tweets: ${user.tweetCount.toLocaleString()}`);
  console.log(`Joined: ${new Date(user.createdAt).toLocaleDateString()}`);

  // 2. Get recent tweets
  const tweets = await apiGet("/user/tweets", { userId: user.id });

  // 3. Calculate engagement stats
  const originalTweets = tweets.data.filter((t) => t.type === "tweet");
  const retweets = tweets.data.filter((t) => t.type === "retweet");
  const replies = tweets.data.filter((t) => t.type === "reply");

  console.log(`\n--- Recent Activity ---`);
  console.log(`Original tweets: ${originalTweets.length}`);
  console.log(`Retweets: ${retweets.length}`);
  console.log(`Replies: ${replies.length}`);

  if (originalTweets.length > 0) {
    const avgLikes =
      originalTweets.reduce((sum, t) => sum + t.likeCount, 0) /
      originalTweets.length;
    const avgRetweets =
      originalTweets.reduce((sum, t) => sum + t.retweetCount, 0) /
      originalTweets.length;
    const avgViews =
      originalTweets.reduce((sum, t) => sum + (t.viewCount || 0), 0) /
      originalTweets.length;

    console.log(`\n--- Avg. Engagement (original tweets) ---`);
    console.log(`Likes: ${Math.round(avgLikes).toLocaleString()}`);
    console.log(`Retweets: ${Math.round(avgRetweets).toLocaleString()}`);
    console.log(`Views: ${Math.round(avgViews).toLocaleString()}`);

    // 4. Find top tweet
    const topTweet = originalTweets.sort(
      (a, b) => b.likeCount - a.likeCount
    )[0];
    console.log(`\n--- Top Tweet ---`);
    console.log(`"${topTweet.text.slice(0, 140)}..."`);
    console.log(
      `Likes: ${topTweet.likeCount.toLocaleString()} | Views: ${(topTweet.viewCount || 0).toLocaleString()}`
    );
  }
}

const username = process.argv[2] || "nodejs";
await analyzeAccount(username);

Run it:

node --env-file=.env analytics.mjs vercel

Two API calls. Profile plus one page of tweets gives you follower stats, posting patterns, average engagement, and the top-performing tweet. On the Pro plan ($17/month, 100K requests), that's 50,000 account analyses before you hit your limit.

Handling Errors

The code above works for happy paths. In production, you'll want to handle rate limits and bad responses:

async function apiGet(path, params = {}) {
  const url = `${BASE_URL}${path}?${new URLSearchParams(params)}`;
  const res = await fetch(url, {
    headers: { "X-API-Key": API_KEY },
  });

  if (res.status === 429) {
    // Rate limited — wait and retry
    const retryAfter = Number(res.headers.get("retry-after")) || 60;
    console.log(`Rate limited. Waiting ${retryAfter}s...`);
    await new Promise((r) => setTimeout(r, retryAfter * 1000));
    return apiGet(path, params); // Retry once
  }

  if (res.status === 401) {
    throw new Error("Invalid API key. Check your TWEETAPI_KEY.");
  }

  if (res.status === 404) {
    throw new Error(`Not found: ${path} with params ${JSON.stringify(params)}`);
  }

  if (!res.ok) {
    const body = await res.text();
    throw new Error(`API error ${res.status}: ${body}`);
  }

  return res.json();
}

The main thing you'll hit in practice is the 429 (rate limit). The retry logic here waits and tries once. In a real app you'd probably want exponential backoff or a proper queue instead.
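The single-retry version above can be extended into that exponential backoff. A sketch, assuming the same BASE_URL and API_KEY from the setup section — the wait doubles on each 429 until the retries run out:

```javascript
// Delay doubles each attempt: 1s, 2s, 4s, 8s... capped at 60s.
function backoffDelay(attempt, baseMs = 1000, maxMs = 60_000) {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

// Same request pattern as apiGet(), but retrying 429s with growing waits.
// BASE_URL and API_KEY are assumed to be defined as in the setup section.
async function apiGetWithBackoff(path, params = {}, maxRetries = 5) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await fetch(`${BASE_URL}${path}?${new URLSearchParams(params)}`, {
      headers: { "X-API-Key": API_KEY },
    });
    if (res.status !== 429) {
      if (!res.ok) throw new Error(`API error: ${res.status}`);
      return res.json();
    }
    await new Promise((r) => setTimeout(r, backoffDelay(attempt)));
  }
  throw new Error("Rate limited: retries exhausted");
}
```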

Rate limits by plan:

Plan            Monthly Requests   Per Minute
Free Trial      100 total          10/min
Pro ($17/mo)    100,000            60/min
Ultra ($57/mo)  500,000            120/min
Mega ($197/mo)  2,000,000          180/min

60 requests/minute on Pro is enough for most things. You'd only need to upgrade if you're doing bulk collection across hundreds of accounts in parallel.

Where to Go From Here

Every endpoint in TweetAPI follows the same pattern: GET request, X-API-Key header, JSON response. Once you've got the helper function, adding new endpoints is just changing the path and params.

People use this stuff to build keyword monitors that ping Slack on a cron schedule, engagement trackers that chart metrics over time in Postgres, competitor research tools, sentiment analysis pipelines feeding into GPT. The patterns are all the same as what's in this tutorial.

The full API docs cover every endpoint, including DMs, posting, following, list management, and communities.

FAQ

Do I need an official Twitter developer account?

No. TweetAPI is a third-party API. Sign up, get a key, start making requests. No application process.

Will this work with Express / Next.js / serverless?

Yes. It's standard fetch, works anywhere Node.js 18+ runs. Drop the functions into your route handlers, API routes, Lambda functions, whatever.

What about TypeScript?

Works as-is. TweetAPI responses have consistent shapes, so you can write interfaces for User, Tweet, etc. once and get full type safety across your app.

Is fetch available in my Node version?

It shipped in Node 18 and became stable in Node 21. If you're on an older version, node-fetch or undici are drop-in replacements. The rest of the code stays identical.

How much does this cost?

Each function call in this tutorial = 1 API request. The analytics script uses 2 per run. On the Pro plan (100K/month at $17), most developers never come close to the ceiling.

Can I post tweets, like, follow?

Yes. TweetAPI has interaction endpoints for all of that. Write operations need an extra authToken parameter (your Twitter auth token). Check the interaction docs for specifics.

What if I get rate limited?

You get a 429 response. Wait for the window to reset (usually 60 seconds), then retry. The error handling section above shows how to handle this. If you're consistently hitting limits, upgrade a tier.

Next Steps

  1. Try the free trial at tweetapi.com, 100 requests, no credit card
  2. Check the full docs at /docs, examples for every endpoint
  3. See pricing at /#pricing, plans from $17/month


TweetAPI isn't affiliated with or endorsed by X Corp. This tutorial is for educational purposes. Make sure your use of Twitter data complies with applicable laws and terms of service.