The Node.js Update - #Week 41 - 12 October, 2018

By Tamas Kadlecsik

Below you can find RisingStack’s collection of the most important Node.js updates, tutorials & announcements from this week.

Node.js v10.12.0 (Current) Released. Changes:

  • assert: The diff output is now a tiny bit improved by sorting object properties when inspecting the values that are compared with each other.

  • cli:

    • The options parser now normalizes _ to - in all multi-word command-line flags, e.g. --no_warnings has the same effect as --no-warnings.
    • Added bash completion for the node binary. To generate a bash completion script, run node --completion-bash. The output can be saved to a file which can be sourced to enable completion.
  • crypto:

    • Added support for PEM-level encryption.
    • Added an API for asymmetric key pair generation. The new methods crypto.generateKeyPair and crypto.generateKeyPairSync can be used to generate public and private key pairs. The API supports RSA, DSA and EC and a variety of key encodings (both PEM and DER).
  • fs: Added a recursive option to fs.mkdir and fs.mkdirSync. If this option is set to true, non-existing parent folders will be automatically created (a short usage sketch follows this list).

  • http2:

    • Added a 'ping' event to Http2Session that is emitted whenever a non-ack PING is received.
    • Added support for the ORIGIN frame.
    • Updated nghttp2 to 1.34.0. This adds RFC 8441 extended connect protocol support to allow use of WebSockets over HTTP/2.
  • module: Added module.createRequireFromPath(filename). This new method can be used to create a custom require function that will resolve modules relative to the filename path.

  • process: Added a 'multipleResolves' process event that is emitted whenever a Promise is attempted to be resolved multiple times, e.g. if the resolve and reject functions are both called in a Promise executor.

  • url: Added url.fileURLToPath(url) and url.pathToFileURL(path). These methods can be used to correctly convert between file: URLs and absolute paths.

  • util:

    • Added the sorted option to util.inspect(). If set to true, all properties of an object and Set and Map entries will be sorted in the returned string. If set to a function, it is used as a compare function.
    • The util.inspect.custom symbol is now defined in the global symbol registry as Symbol.for('nodejs.util.inspect.custom').
    • Added support for BigInt numbers in util.format().
  • V8 API: A number of V8 C++ APIs have been marked as deprecated since they have been removed in the upstream repository. Replacement APIs are added where necessary.

  • Windows: The Windows msi installer now provides an option to automatically install the tools required to build native modules.

  • Workers:

    • Debugging support for Workers using the DevTools protocol has been implemented.
    • The public inspector module is now enabled in Workers.
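
A minimal sketch exercising a few of these additions (assumes Node.js >= 10.12.0; paths and values are illustrative):

const fs = require('fs');
const util = require('util');
const { generateKeyPairSync } = require('crypto');
const { fileURLToPath, pathToFileURL } = require('url');

// fs: recursive mkdir creates any missing parent directories.
fs.mkdirSync('./tmp/a/b/c', { recursive: true });

// crypto: synchronous RSA key pair generation with PEM encoding.
// publicKey and privateKey are PEM-encoded strings here.
const { publicKey, privateKey } = generateKeyPairSync('rsa', {
  modulusLength: 2048,
  publicKeyEncoding: { type: 'spki', format: 'pem' },
  privateKeyEncoding: { type: 'pkcs8', format: 'pem' }
});

// url: round-trip between file: URLs and absolute paths.
const href = pathToFileURL('/tmp/a/b/c').href; // file:///tmp/a/b/c
console.log(fileURLToPath(href));              // /tmp/a/b/c

// util: sorted property output in util.inspect().
console.log(util.inspect({ b: 2, a: 1 }, { sorted: true })); // { a: 1, b: 2 }

// process: detect Promises that settle more than once.
process.on('multipleResolves', (type, promise, value) => {
  console.error('multipleResolves:', type, value);
});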

Node.js Security Handbook by Sqreen

Improve the security of your Node.js app with the Node.js Security Handbook made for developers.

Common Node8 mistakes in Lambda

That’s it: 5 common mistakes to avoid when working with Node.js 8.10 in Lambda. Including:

  • Still using callbacks
  • Not using promisify
  • Too sequential
  • async/await inside forEach() (see the sketch after this list)
  • Not using the AWS SDK’s .promise()
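
A hedged sketch of the forEach() pitfall (records and processRecord are hypothetical names, and the code is assumed to run inside an async handler):

// Anti-pattern: forEach() doesn't await its async callback, so the
// handler can return before any of this work completes.
records.forEach(async (record) => {
  await processRecord(record); // effectively fire-and-forget
});

// Fix: map the records to promises and await them all (runs concurrently).
await Promise.all(records.map((record) => processRecord(record)));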

Using Node.js to Read Really, Really Large Datasets & Files (Pt 1)

“In the end, Node.js’s pure file and big data handling functions fell a little short of what I needed, but with just one extra NPM package, EventStream, I was able to parse through a massive dataset without crashing the Node server.”
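
The pattern looks roughly like this (a sketch using the event-stream package’s split() and mapSync() helpers; the file name is illustrative):

const fs = require('fs');
const es = require('event-stream');

fs.createReadStream('very-large-file.txt')
  .pipe(es.split()) // re-emit the stream line by line
  .pipe(es.mapSync((line) => {
    // process each line here without holding the whole file in memory
    return line;
  }));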

JWT authentication: When and how to use it

Learn when JWT is best used, when it’s best to use something else, and how to prevent the most basic security issues.

For more Node.js content, follow us on Twitter @RisingStack.

In case you need guidance with Docker, Kubernetes, Microservices or Node.js, feel free to ping us at info@risingstack.com!

Source:: risingstack.com

Touring What's New in the Angular CLI

By John Papa

Recently I presented “Touring What’s New in the Angular CLI” at Angular Mix. A lot has changed or been added in v6 and the forthcoming v7. You can check out the description below and the slides, as well.

While you were sleeping, the Angular CLI had a makeover. There are new features and several improvements to the features you already love. We’ll walk through a tour of the best new features and changes including updating your Angular app to a new version, adding other libraries, and creating Angular libraries. We’ll also run through building an app from scratch using the CLI. Whether you are new to the Angular CLI or it’s a familiar old friend to you, this session will arm you with the knowledge to take full advantage of the best way to create and maintain Angular apps.

Source:: johnpapa

Laravel Eloquent: API Resources

By Chris Ganga

Introduction

When creating APIs, we sometimes specify the data we want back in the various controller actions:

public function show(Book $book)
{
    return response()->json([
        'data' => [
            'title' => $book->title,
            'description' => $book->description,
            'author' => $book->author->name
        ]
    ]);
}

Notice we omitted the attributes created_at and updated_at when formatting the response?

Take another scenario where we want to update a book and expect a response back.

public function update(Request $request, Book $book)
{
    $book->update($request->all());
    return response()->json([
        'data' => [
            'title' => $book->title,
            'description' => $book->description,
        ]
    ]);
}

We still have to format the response for the store() method and probably return the created book as part of the response.

If we want the same $book attributes returned in all responses that involve a book resource, there is a high likelihood of forgetting some attribute, especially when working with many attributes. Having to keep track of these attributes in every controller action that involves a book resource is also a hassle. Assuming we have a book component on the frontend that is reused everywhere a book resource appears, we are going to run into a problem when one of the attributes is missing.

To handle such inconsistencies in API responses, we may update the model with what should be returned whenever we serialize a model instance.

Still working with the book example:

# Book model
protected $hidden = ['created_at', 'updated_at'];

This means the created_at and updated_at attributes won’t be part of the response every time we serialize a book resource.

But then again, how do we go about adding custom attributes that are not part of the original model?

protected $appends = ['date_stored'];

public function getDateStoredAttribute()
{
    return (string) $this->created_at->diffForHumans();
}

And that’s just one custom attribute. Had we wanted many attributes to be part of the model’s response, trust me, we would end up with one bloated model. It’s a lot easier to create a dedicated resource that responds with exactly the data one needs.

Fractal

Before Laravel 5.5, Fractal, a third-party package, was the tool most developers used to format API responses. Fractal provides a presentation and transformation layer for complex data output, the likes found in RESTful APIs, and works really well with JSON. Think of it as a view layer for your JSON/YAML/etc. Fractal encourages good API design, and responses will be consistent across the API.

Introducing Laravel API resources

As of version 5.5, Laravel ships with the capabilities Fractal offered, with very little configuration. Setting up Fractal was a bit of a process – require the package, register service providers, create transform classes and so forth.

With API resources, developers can easily specify the data they want returned on a per-model basis without having to update models or even specify the attributes they want as part of the response in the various controller methods.

API resources provide a uniform interface that can be used anywhere in the app. Eloquent relationships are also taken care of.

Laravel provides two artisan commands for generating resources and collections – don’t worry about the difference between the two yet, we’ll get there in a bit. For both resources and collections, the response is wrapped in a data attribute, a common JSON response convention.

We’ll look at how to work with API resources in the next section by playing around with a small project.

Prerequisites:

To follow along in this article, you need to have the following prerequisites:

  • Basic Laravel knowledge

  • A working Laravel development environment. Note, the project is built on Laravel 5.6 which requires PHP >= 7.1.3.

Songs API Demo

Clone this repo and follow the instructions in the README.md to get things up and running.

With the project setup, we can now start getting our hands dirty. Also, since this is a very small project, we won’t be creating any controllers and will instead test out responses inside route closures.

Let’s start by generating a SongResource class:

php artisan make:resource SongResource

If we peek inside the newly created resource file, i.e. SongResource (resource files usually go inside the app/Http/Resources folder), the contents look like this:

[...]
class SongResource extends JsonResource
{
    /**
     * Transform the resource into an array.
     *
     * @param  \Illuminate\Http\Request  $request
     * @return array
     */
    public function toArray($request)
    {
        return parent::toArray($request);
    }
}

By default, we have parent::toArray($request) inside the toArray() method. If we leave things at this, all visible model attributes will be part of our response. To tailor the response, we specify the attributes we want to be converted to JSON inside this toArray() method.

Let’s update the toArray() method to match the snippet below:

public function toArray($request)
{
    return [
        'id' => $this->id,
        'title' => $this->title,
        'rating' => $this->rating,
    ];
}

As you can see, we can access the model properties directly from the $this variable because a resource class automatically proxies property and method access down to the underlying model.

Let’s now update the routes/api.php with the snippet below:

# routes/api.php

[...]
use App\Http\Resources\SongResource;
use App\Song;
[...]

Route::get('/songs/{song}', function(Song $song) {
    return new SongResource($song);
});

Route::get('/songs', function() {
    return new SongResource(Song::all());
});

If we visit the URL /api/songs/1, we’ll see a JSON response containing the key-value pairs we specified in the SongResource class for the song with an id of 1:

{
  "data": {
    "id": 1,
    "title": "Mouse.",
    "rating": 3
  }
}

However, if we try visiting the URL /api/songs, an Exception is thrown: Property [id] does not exist on this collection instance.

This is because instantiating the SongResource class requires a single resource instance to be passed to the constructor, not a collection.

If we wanted a collection returned instead of a single resource, there is a static collection() method that can be called on a Resource class passing in a collection as the argument. Let’s update our songs route closure to this:

Route::get('/songs', function() {
    return SongResource::collection(Song::all());
});

Visiting the /api/songs URL again will give us a JSON response containing all the songs.

{
  "data": [
    {
      "id": 1,
      "title": "Mouse.",
      "rating": 3
    },
    {
      "id": 2,
      "title": "I'll.",
      "rating": 0
    }
  ]
}

Resources work just fine when returning a single resource or even a collection, but they have limitations if we want to include metadata in the response. That’s where Collections come to our rescue.

To generate a collection class, we run:

php artisan make:resource SongsCollection

The main difference between a JSON resource and a JSON collection is that a resource extends the JsonResource class and expects a single resource to be passed when being instantiated while a collection extends the ResourceCollection class and expects a collection as the argument when being instantiated.

Back to the metadata bit. Assuming we wanted some metadata such as the total song count to be part of the response, here’s how to go about it when working with the ResourceCollection class:

class SongsCollection extends ResourceCollection
{
    public function toArray($request)
    {
        return [
            'data' => $this->collection,
            'meta' => ['song_count' => $this->collection->count()],
        ];
    }
}

If we update our /api/songs route closure to this:

[...]
use App\Http\Resources\SongsCollection;
[...]
Route::get('/songs', function() {
    return new SongsCollection(Song::all());
});

And visit the URL /api/songs, we now see all the songs inside the data attribute as well as the total count inside the meta bit:

{
  "data": [
    {
      "id": 1,
      "title": "Mouse.",
      "artist": "Carlos Streich",
      "rating": 3,
      "created_at": "2018-09-13 15:43:42",
      "updated_at": "2018-09-13 15:43:42"
    },
    {
      "id": 2,
      "title": "I'll.",
      "artist": "Kelton Nikolaus",
      "rating": 0,
      "created_at": "2018-09-13 15:43:42",
      "updated_at": "2018-09-13 15:43:42"
    },
    {
      "id": 3,
      "title": "Gryphon.",
      "artist": "Tristin Veum",
      "rating": 3,
      "created_at": "2018-09-13 15:43:42",
      "updated_at": "2018-09-13 15:43:42"
    }
  ],
  "meta": {
    "song_count": 3
  }
}

But we have a problem: each song inside the data attribute is not formatted to the specification we defined earlier inside the SongResource and instead has all the attributes.

To fix this, inside the toArray() method, set the value of data to SongResource::collection($this->collection) instead of having $this->collection.

Our toArray() method should now look like this:

public function toArray($request)
{
    return [
        'data' => SongResource::collection($this->collection),
        'meta' => ['song_count' => $this->collection->count()]
    ];
}

You can verify we get the correct data in the response by visiting the /api/songs URL again.

What if one wants to add metadata to a single resource and not a collection? Luckily, the JsonResource class comes with an additional() method which lets you specify any additional data you’d like to be part of the response when working with a resource:

Route::get('/songs/{song}', function(Song $song) {
    return (new SongResource($song))->additional([
        'meta' => [
            'anything' => 'Some Value'
        ]
    ]);
});

In this case, the response would look somewhat like this:

{
  "data": {
    "id": 1,
    "title": "Mouse.",
    "rating": 3
  },
  "meta": {
    "anything": "Some Value"
  }
}

Including Relationships

In this project, we only have two models, Album and Song. The current relationship is a one-to-many relationship, meaning an album has many songs and a song belongs to an album.

Making an album be part of a song’s response is pretty straightforward. Let’s update the toArray() method inside the SongResource to take note of the album:

class SongResource extends JsonResource
{
    public function toArray($request)
    {
        return [
            // other attributes
            'album' => $this->album
        ];
    }
}

If we want to be more specific in terms of what album attributes should be present in the response, we can create an AlbumResource similar to what we did with songs.

To create the AlbumResource we run:

php artisan make:resource AlbumResource

Once the resource class has been created, we then specify the attributes we want included in the response.

class AlbumResource extends JsonResource
{
    public function toArray($request)
    {
        return [
            'title' => $this->title
        ];
    }
}

And now inside the SongResource class, instead of doing 'album' => $this->album, we can make use of the AlbumResource class we just created.

class SongResource extends JsonResource
{
    public function toArray($request)
    {
        return [
            // other attributes
            'album' => new AlbumResource($this->album)
        ];
    }
}

If we visit the /api/songs URL again, we’ll notice an album is now part of each song in the response. The only problem with this approach is that it brings up the N + 1 query problem.

For demonstration purposes, add the snippet below to the routes/api.php file:

# routes/api.php

[...]
DB::listen(function($query) {
    var_dump($query->sql);
});

Visit the /api/songs URL again. Notice that for each song, we make an extra query to retrieve the album’s details? This can be avoided by eager loading relationships. In our case, update the code inside the /api/songs route closure to:

return new SongsCollection(Song::with('album')->get());

Reload the page again and you’ll notice the number of queries has reduced. Comment out the DB::listen snippet since we don’t need that anymore.

Conditionals When Working With Resources

Every now and then, we might have a conditional determining the type of response that should be returned.

One approach we could take is introducing if statements inside our toArray() method. The good news is we don’t have to, as the JsonResource class pulls in a ConditionallyLoadsAttributes trait that has a handful of methods for handling conditionals. Just to mention a few, we have the when(), whenLoaded() and mergeWhen() methods.

We’ll only brush through a few of these methods, but the documentation is quite comprehensive.

The whenLoaded() method

This method includes related models in the response only when they have already been eager loaded, thereby preventing the N + 1 query problem.

Still working with the Album resource as a point of reference (an album has many songs):

public function toArray($request)
{
    return [
        // other attributes
        'songs' => SongResource::collection($this->whenLoaded('songs'))
    ];
}

In the case where we are not eager loading songs when retrieving an album, the songs key will be left out of the response entirely.

The mergeWhen() Method

Instead of having an if statement that dictates whether some attribute and its value should be part of the response, we can use the mergeWhen() method, which takes the condition to evaluate as its first argument and an array containing the key-value pairs that should be part of the response if the condition evaluates to true:

public function toArray($request)
{
    return [
        // other attributes
        'songs' => SongResource::collection($this->whenLoaded('songs')),
        $this->mergeWhen($this->songs->count() > 10, [
            'new_attribute' => 'attribute value'
        ])
    ];
}

This looks cleaner and more elegant than having if statements wrap the entire return block.

Unit Testing API Resources

Now that we’ve learnt how to transform our responses, how do we actually verify that the response we get back is what we specified in our resource classes?

Here, we’ll write tests verifying the response contains the correct data, as well as making sure Eloquent relationships are still maintained.

Let’s create the test:

php artisan make:test SongResourceTest --unit

Notice I passed the --unit flag when generating the test to tell Laravel this should be a unit test.

Let’s start by writing the test to make sure our response from the SongResource class contains the correct data:

[...]
use App\Http\Resources\SongResource;
use App\Http\Resources\AlbumResource;
[...]
class SongResourceTest extends TestCase
{
    use RefreshDatabase;

    public function testCorrectDataIsReturnedInResponse()
    {
        $resource = (new SongResource($song = factory('App\Song')->create()))->jsonSerialize();
    }
}

Here, we first create a song resource, then call jsonSerialize() on the SongResource to transform the resource into JSON format, as that’s what would ideally be sent to our frontend.

And since we already know the song attributes that should be part of the response, we can now make our assertion:

$this->assertArraySubset([
    'title' => $song->title,
    'rating' => $song->rating
], $resource);

I only matched against two attributes and their corresponding values to keep things simple, but you can list as many attributes as you would like.

What about making sure our model relationships are preserved even after converting our models to resources?

public function testSongHasAlbumRelationship()
{
    $resource = (new SongResource($song = factory('App\Song')->create([
        'album_id' => factory('App\Album')->create(['id' => 1])->id
    ])))->jsonSerialize();
}

Here, we create a song with an album_id of 1 then pass the song on to the SongResource class before finally transforming the resource into JSON format.

To verify that the song-album relationship is still maintained, we make an assertion on the album attribute of the $resource we just created. Like so:

$this->assertInstanceOf(AlbumResource::class, $resource["album"]);

Note, however, if we did $this->assertInstanceOf(Album::class, $resource["album"]) our test would fail since we are transforming the album instance into a resource inside the SongResource class.

As a recap: we first create a model instance, pass the instance to the resource class, and convert the resource into JSON format before finally making our assertions. I hope this helps.

Recap

Congratulations if you have managed to get to this point. We’ve looked at what Laravel API resources are, how to create them as well as how to test out various JSON responses. If you are the curious type, you can peep inside the JsonResource class and see all the methods that are available to us.

Do check the official docs to learn more about API resources. The complete code for this tutorial is available on GitHub.

Source:: scotch.io

Authenticate a Node ES6 API with JSON Web Tokens

By Elizabeth Mabishi

In this guide, we’ll be implementing token-based authentication in our own Node.js API using JSON Web Tokens.

To keep this short and relatively sweet, if you’d like to read about what tokens are and why you should consider using them, have a look at this article here. To catch up on what JSON web tokens are, have a look here.

Now that we have all that out of the way, let’s get started.

Plan of attack

We’ll begin by:

  1. Setting up our development environment and initializing our express server.
  2. Creating our first basic route and controller.
  3. Fleshing out our routes and controllers to add users and login users.
  4. Creating a route and controller that will handle getting all users.

Finally, we’ll

  1. Add middleware to protect our GET /users route by requiring a user to be an admin and to have a valid token.
  2. Validate that only an admin with a token can access the protected route.

Sounds exciting? Let’s get to it then.

Setup

Before we get started in earnest, we’ll need to have a few things taken care of.

Folder structure

Here’s what our folder structure will look like:


├── config.js
├── controllers
│   └── users.js
├── index.js
├── models
│   └── users.js
├── routes
│   ├── index.js
│   └── users.js
├── utils.js

Quickly create it using the following commands:

>> mkdir -p jwt-node-auth/{controllers,models,routes}
>> cd jwt-node-auth
>> touch index.js config.js utils.js controllers/users.js models/users.js routes/index.js routes/users.js

Prerequisites & Dependencies

The only global install we’ll need is Node.js, so make sure you have it installed. After that, let’s install our local project dependencies.

Run the following command to initialize our package.json file.

npm init --yes

Install all our dependencies by running:

>> npm install express body-parser bcrypt dotenv jsonwebtoken mongoose  --save
>> npm install morgan nodemon  --save-dev

Why these dependencies?

Dependencies
  1. body-parser: This will add all the information we pass to the API to the request.body object.
  2. bcrypt: We’ll use this to hash our passwords before we save them to our database.
  3. dotenv: We’ll use this to load all the environment variables we keep secret in our .env file.
  4. jsonwebtoken: This will be used to sign and verify JSON web tokens.
  5. mongoose: We’ll use this to interface with our MongoDB database.

Development dependencies

  1. morgan: This will log all the requests we make to the console whilst in our development environment.
  2. nodemon: We’ll use this to restart our server automatically whenever we make changes to our files.

Create a .env file at the root of the project and add the following environment variables, replacing the placeholder values with your own:

JWT_SECRET=addjsonwebtokensecretherelikeQuiscustodietipsoscustodes
MONGO_LOCAL_CONN_URL=addmongoconnectionurlhere
MONGO_DB_NAME=addmongodbnamehere

Let’s set up our server.

Server initialization

Add the following line to your package.json file.

package.json

"scripts": {
    "dev": "NODE_ENV=development nodemon index.js"
  },

We’ll now start our server with the npm run dev command.

Every time we do this, development is automatically set as a value for the NODE_ENV key in our process object.
The command nodemon index.js will allow nodemon to restart our server every time we make changes in our folder structure.

Let’s define the port we’ll have our server listen to in the config file.

config.js

module.exports = {
  development: {
    port: process.env.PORT || 3000
  }
}

Then set up our server like this:

index.js

const express = require('express'); 
const logger = require('morgan');
const bodyParser = require('body-parser');

const app = express();
const router = express.Router();

const environment = process.env.NODE_ENV; // development
const stage = require('./config')[environment];

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({
  extended: true
}));

if (environment !== 'production') {
  app.use(logger('dev'));
}

app.use('/api/v1', (req, res, next) => {
  res.send('Hello');
  next();
});

app.listen(`${stage.port}`, () => {
  console.log(`Server now listening at localhost:${stage.port}`);
});

module.exports = app;

Run npm run dev at the root of the project and make sure the word Hello is returned when you access localhost:3000/api/v1.

Now that we’re all set up, let’s move on to bootstrapping our add user functionality.

Developing our add user functionality

Let’s modify our server to accept our routing function as middleware that will be triggered on all our routes.

index.js

const routes = require('./routes/index.js');

app.use('/api/v1', routes(router));

controllers/users.js

module.exports = {
  add: (req, res) => {
    return;
  }
}

Route setup

routes/index.js

const users = require('./users');

module.exports = (router) => {
  users(router);
  return router;
};

Here, for the sake of modularity, we pass the router from our index.js file to the router function that will handle all functionality related to our users.

routes/users.js

const controller = require('../controllers/users');

module.exports = (router) => {
  router.route('/users')
    .post(controller.add);
};

All we’re doing here is passing our add controller to our router. It’ll be triggered when we make a POST request to the /users route.

Next, let’s work on defining our users model.

Developing the User Model

models/users.js

const mongoose = require('mongoose');
const bcrypt = require('bcrypt');

const environment = process.env.NODE_ENV;
const stage = require('../config')[environment];

// schema maps to a collection
const Schema = mongoose.Schema;

const userSchema = new Schema({
  name: {
    type: 'String',
    required: true,
    trim: true,
    unique: true
  },
  password: {
    type: 'String',
    required: true,
    trim: true
  }
});

module.exports = mongoose.model('User', userSchema);

To add users to our collection, we’ll require that they give us a name and password string.

Hashing users passwords

As stated before, we’ll use bcrypt to hash our users’ passwords before we store them.

Let’s add a line to our config file to specify the bcrypt cost factor (salt rounds) to use when hashing passwords.

config.js

module.exports = {
  development: {
    port: process.env.PORT || 3000,
    saltingRounds: 10
  }
}

We’ll use the mongoose pre hook on save to make sure our passwords are hashed before we save them.
Add the following above the module.exports = mongoose.model('User', userSchema); line.

models/users.js

// encrypt password before save
userSchema.pre('save', function(next) {
  const user = this;
  if(!user.isModified || !user.isNew) { // don't rehash if it's an old user
    next();
  } else {
    bcrypt.hash(user.password, stage.saltingRounds, function(err, hash) {
      if (err) {
        console.log('Error hashing password for user', user.name);
        next(err);
      } else {
        user.password = hash;
        next();
      }
    });
  }
});

Now, let’s modify our add controller to handle creating users once it’s handed a name and password.

controllers/users.js

const mongoose = require('mongoose');
const User = require('../models/users');

const connUri = process.env.MONGO_LOCAL_CONN_URL;

module.exports = {
  add: (req, res) => {
    mongoose.connect(connUri, { useNewUrlParser : true }, (err) => {
      let result = {};
      let status = 201;
      if (!err) {
        const { name, password } = req.body;
        const user = new User({ name, password }); // document = instance of a model
        // TODO: We can hash the password here before we insert instead of in the model
        user.save((err, user) => {
          if (!err) {
            result.status = status;
            result.result = user;
          } else {
            status = 500;
            result.status = status;
            result.error = err;
          }
          res.status(status).send(result);
        });
      } else {
        status = 500;
        result.status = status;
        result.error = err;
        res.status(status).send(result);
      }
    });
  },
}

In the above code, we connect to our MongoDB database, then access the name and password provided in the request by destructuring those properties from the request.body object. Remember, we can do this because of our body-parser middleware.

Next we create a new user document by calling new on our model then in the same step add the name and password we got from the request to the new document.

We could easily have done this instead.

const name = req.body.name;
const password = req.body.password;

let user = new User();
user.name = name;
user.password = password;

user.save((err, user) => { ... });

Does that seem clearer?

We exploited mongoose’s pre hook and hashed our password in our users model, but we could just as well have hashed it in our controller before we called user.save.

Finally, we pass a callback function into user.save that will handle our errors and pass the user back to us in our server response. We attach a handy status property in our response to let us know if the result was successful or not.

Testing with Postman

I’m using Postman to test out my API functionality, but you can use any request library or application you like. Heck, you can even use curl if you’re a console purist. Cue the Xbox and PlayStation fanboys and fangirls. Tada!

As you can see below, we can now create users by making POST requests to the /api/v1/users endpoint.

What’s that strange string as the value under the password key? Well, that’s our password in hash form.
We store it this way because it’s safer. Hashes are ridiculously difficult to reverse.
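
A quick sketch of the idea (the plaintext password here is purely illustrative):

const bcrypt = require('bcrypt');

// Hashing is one-way; verification compares a candidate password
// against the stored hash instead of ever decrypting it.
bcrypt.hash('hunter2', 10).then((hash) => {
  // hash looks like '$2b$10$...' -- the "strange string" we store
  return bcrypt.compare('hunter2', hash); // resolves to true
});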

Edit out the pre-save hashing hook and see what happens. Don’t forget to put it back though. Perish the thought!

We’ll see how to verify that a user is who they say they are, using the password they give us later when we work on the /login route.

Here’s what happens when we try to create a user without specifying a password.

No password error

Here’s what happens when we try to duplicate a user.

Duplicate user error

Developing our log in user functionality

routes/users.js

const controller = require('../controllers/users');

module.exports = (router) => {
  router.route('/users')
    .post(controller.add);

  router.route('/login')
    .post(controller.login);
};

Add the following import statement to the users controller.

const bcrypt = require('bcrypt');

Then, let’s add the login controller that will handle our requests to the /login route.

login: (req, res) => {
    const { name, password } = req.body;

    mongoose.connect(connUri, { useNewUrlParser: true }, (err) => {
      let result = {};
      let status = 200;
      if(!err) {
        User.findOne({name}, (err, user) => {
          if (!err && user) {
            // We could compare passwords in our model instead of below
            bcrypt.compare(password, user.password).then(match => {
              if (match) {
                result.status = status;
                result.result = user;
              } else {
                status = 401;
                result.status = status;
                result.error = 'Authentication error';
              }
              res.status(status).send(result);
            }).catch(err => {
              status = 500;
              result.status = status;
              result.error = err;
              res.status(status).send(result);
            });
          } else {
            status = 404;
            result.status = status;
            result.error = err;
            res.status(status).send(result);
          }
        });
      } else {
        status = 500;
        result.status = status;
        result.error = err;
        res.status(status).send(result);
      }
    });
  }

Above, we query our collection to find the user by their name. If we find them, we use bcrypt to compare the password they’ve given us against the hash we previously stored. If we don’t find them, we send back an error.

Login before token addition

As you can see above, we can now log in our users. As an experiment, try logging in a user without a password or with an incorrect password and see what happens.

Adding Tokens to our authentication process

Let’s add the following import statement to our users controller then work on modifying our login controller to create tokens.

As mentioned before, we’ll use these to protect one of our routes from unauthorized access.

controllers/users.js

const jwt = require('jsonwebtoken');
login: (req, res) => {
    const { name, password } = req.body;

    mongoose.connect(connUri, { useNewUrlParser: true }, (err) => {
      let result = {};
      let status = 200;
      if(!err) {
        User.findOne({name}, (err, user) => {
          if (!err && user) {
            // We could compare passwords in our model instead of below as well
            bcrypt.compare(password, user.password).then(match => {
              if (match) {
                status = 200;
                // Create a token
                const payload = { user: user.name };
                const options = { expiresIn: '2d', issuer: 'https://scotch.io' };
                const secret = process.env.JWT_SECRET;
                const token = jwt.sign(payload, secret, options);

                // console.log('TOKEN', token);
                result.token = token;
                result.status = status;
                result.result = user;
              } else {
                status = 401;
                result.status = status;
                result.error = `Authentication error`;
              }
              res.status(status).send(result);
            }).catch(err => {
              status = 500;
              result.status = status;
              result.error = err;
              res.status(status).send(result);
            });
          } else {
            status = 404;
            result.status = status;
            result.error = err;
            res.status(status).send(result);
          }
        });
      } else {
        status = 500;
        result.status = status;
        result.error = err;
        res.status(status).send(result);
      }
    });
  }

Once we verify that a user is who they say they are, we create a token and pass it to our server response. If something goes wrong, we pass back an error as the response.

Now, we get a token every time we successfully log in a user. Yaaay!

Got token? Yes.

Express middleware

An Express middleware function is a function that gets triggered when a route pattern is matched in our request URI. All middleware has access to the request and response objects and can call the next() function to pass execution on to the subsequent middleware function.

Believe it or not, we’ve written out several already. Don’t believe me? I’ll show you.

body-parser and morgan are both middleware that act on all our routes.
When we call the app.use function without specifying the first (path) parameter, we’re essentially doing this:

app.use(bodyParser.json()); 
// This is equivalent to
app.use('/', bodyParser.json());

if (environment !== 'production') {
  app.use(logger('dev'));
  // and this
  app.use('/', logger('dev'));
}

// Here, we've specified the pattern we'd like to be matched from our request's uri
app.use('/api/v1', (req, res, next) => {
  res.send('Hello');
  // We call next to hand execution over to the next middleware
  next();
});

Hold that thought and for now, let’s create a controller function that will get all users from our users collection.

controllers/users.js

getAll: (req, res) => {
    mongoose.connect(connUri, { useNewUrlParser: true }, (err) => {
      User.find({}, (err, users) => {
        if (!err) {
          res.send(users);
        } else {
          console.log('Error', err);
        }
      });
    });
  }

Add our new controller to our routing function.

routes/users.js

const controller = require('../controllers/users');

module.exports = (router) => {
  router.route('/users')
    .post(controller.add)
    .get(controller.getAll); // This route will be protected shortly

  router.route('/login')
    .post(controller.login);
};

Have you noticed that our controllers are essentially middleware functions passed to our other routing middleware? Above, we’ve just added a controller that will handle GET requests made to /users. However, we haven’t protected our route yet.

If we make a GET request to /users, here’s what happens.

Getting all users without protection middleware

But we wouldn’t want just any user to access a list of all our users. So, let’s create an admin user then check if they have a token before we allow access to this functionality.

Admin creation

Now, finally, let’s write out middleware to validate that a user has a valid token (issued by us and not expired) before we allow access to certain routes on our application.

utils.js

const jwt = require('jsonwebtoken');

module.exports = {
  validateToken: (req, res, next) => {
    const authorizationHeader = req.headers.authorization;
    let result;
    if (authorizationHeader) {
      const token = req.headers.authorization.split(' ')[1]; // Bearer <token>
      const options = {
        expiresIn: '2d',
        issuer: 'https://scotch.io'
      };
      try {
        // verify makes sure that the token hasn't expired and has been issued by us
        result = jwt.verify(token, process.env.JWT_SECRET, options);

        // Let's pass back the decoded token to the request object
        req.decoded = result;
        // We call next to pass execution to the subsequent middleware
        next();
      } catch (err) {
        // Throw an error just in case anything goes wrong with verification
        throw new Error(err);
      }
    } else {
      result = { 
        error: `Authentication error. Token required.`,
        status: 401
      };
      res.status(401).send(result);
    }
  }
};

Let’s add our function to our router so that it’s called before our getAll controller. If validateToken throws an error, controller.getAll won’t be called. Also, if it sends a response with an error, since we haven’t called next in our else block, getAll won’t be called either.

routes/users.js

const controller = require('../controllers/users');
const validateToken = require('../utils').validateToken;

module.exports = (router) => {
  router.route('/users')
    .post(controller.add)
    .get(validateToken, controller.getAll); // This route is now protected

  router.route('/login')
    .post(controller.login);
};

If we leave it as is, all users with a token will be able to access a list of our users, but we only want admins to do this. Let’s make a few final tweaks to our controller to achieve this.

controllers/users.js

getAll: (req, res) => {
    mongoose.connect(connUri, { useNewUrlParser: true }, (err) => {
      let result = {};
      let status = 200;
      if (!err) {
        const payload = req.decoded;
        // TODO: Log the payload here to verify that it's the same payload
        //  we used when we created the token
        // console.log('PAYLOAD', payload);
        if (payload && payload.user === 'admin') {
          User.find({}, (err, users) => {
            if (!err) {
              result.status = status;
              result.error = err;
              result.result = users;
            } else {
              status = 500;
              result.status = status;
              result.error = err;
            }
            res.status(status).send(result);
          });
        } else {
          status = 401;
          result.status = status;
          result.error = `Authentication error`;
          res.status(status).send(result);
        }
      } else {
        status = 500;
        result.status = status;
        result.error = err;
        res.status(status).send(result);
      }
    });
  }

As you can see below, if we don’t pass a token in our authorization headers, we’re refused access.

No token error

Here’s what happens when we pass an invalid token.

Wrong token error

There it is, our middleware is working as intended. Congratulations!

When we pass the token we got from logging in as our admin, we’re allowed to retrieve our users list.
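
For reference, the request shape looks roughly like this (a sketch assuming the node-fetch package, which is not among this project’s dependencies; token holds the value returned by POST /api/v1/login):

const fetch = require('node-fetch');

// Pass the JWT in the Authorization header as a Bearer token,
// matching the split(' ')[1] parsing in validateToken.
fetch('http://localhost:3000/api/v1/users', {
  headers: { Authorization: `Bearer ${token}` }
})
  .then((res) => res.json())
  .then(console.log);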

Right token

Conclusion

We’ve covered a lot in this article. As a recap, we’ve learned:

  1. What express middleware is and its basics.
  2. How to create and use routes and controllers that work as express middleware.
  3. How to create and verify user tokens.
  4. How to protect certain routes in our application with our token middleware.

Feedback

I hope this was helpful. As always, please drop me a line in the comments below if you want to chat, ask me a question or give me some feedback.

Source:: scotch.io

Building Intelligent Apps with MongoDB and Google Cloud - Part 1

By Nolan Gallagher

Data analytics is a perpetual underachiever. Every generation of tools promises us better insight and never quite delivers. So we get stuck re-platforming and re-designing, hoping the next iteration will finally get us to the intelligence utopia. Yet modern applications must provide rich experiences, offer decision support, and continuously learn and adapt to win their users. Analytics and AI are at the heart of these Intelligent Apps.

We decided to build an Intelligent App to demonstrate how easy it is to take advantage of ML and AI cloud services without hiring a team of data scientists. First, we built a simple e-commerce application – MongoDB SwagStore – using React and MongoDB Stitch with MongoDB Atlas on GCP. Stitch saved us hundreds of lines of code and our app was ready in days. But aside from implementing stock replenishment notifications with Stitch Triggers and Twilio, it wasn’t very intelligent… yet.

We enabled our SwagStore with a product recommendation engine. Rather than implementing a recommendation engine from scratch, we used Google Cloud ML to train and tune a TensorFlow model that implements a WALS collaborative filtering algorithm. We then used Google Cloud Endpoints to serve up these personalized recommendations.

When a user authenticates, MongoDB Stitch sends an HTTP GET request to the Google Cloud Endpoint to obtain a list of recommended products.

A Stitch Function updates the recommendations array in the user document with the returned result.

exports = function() {
    // services
    const gcp = context.services.get("GoogleCloudRec");
    const mongodb = context.services.get("mongodb-atlas");
    // my swagstore collections
    const users = mongodb.db("swagstore").collection("users");
    const products = mongodb.db("swagstore").collection("products");

    return users.findOne({ user_id: context.user.id })
        .then(user => {
            if (!user.gcpId) {
                return [];
            }
            // URL to the GCP cloud endpoint
            const url = `https://jfmlrecengine.appspot.com/recommendation?userId=${user.gcpId}`;
            return gcp.get({ url })
                .then(response => {
                    console.log("Retrieved Recommendations");
                    return EJSON.parse(response.body.text());
                })
                .then(result => {
                    // Get the product info for the array of product ids
                    return products.find(
                        { id: { "$in": result.articles } },
                        { _id: 0, id: 1, name: 1, image: 1 }
                    ).toArray();
                })
                .then(products => {
                    console.log(JSON.stringify(products));
                    // Write the products to the user document
                    return users.updateOne(
                        { "gcpId": user.gcpId },
                        { $set: { "personalized_recs": products } }
                    ).then(() => { return products; });
                });
        });
};

So when Jane logs into SwagStore she will see these product recommendations:

And Jasper – different ones:

By using MongoDB Stitch combined with powerful cloud services and APIs, you can build a recommendation system like this very quickly and plug it right into your operational app, getting your developers and data scientists to work together, operationalizing insight, and delivering intelligence to your customers. Give it a try!

Stay tuned for Part 2 where SwagStore becomes even more intelligent with an AI chatbot.

Source:: scotch.io