Channel: Laravel News

Optimize Model Queries with Eager Loading


Object-relational mapping (ORM) makes working with databases amazingly simple. While defining database relationships in an object-oriented way makes it easy to query related model data, developers might not pay attention to the underlying database calls.

A standard database optimization for an ORM is eager loading related data. We will set up some example relationships and then walk through how queries change with and without eager loading. I like to get my hands directly on code and experiment with things, and I hope that working through these examples will further help you understand how to optimize your queries.

Introduction

At a basic level, ORMs “lazy” load related model data. After all, how is the ORM supposed to know your intention? Perhaps you will never actually use the related model’s data after querying the model. Not optimizing the query is known as an “N+1” issue. When you use objects to represent queries, you might be making queries without even knowing it.

Imagine that you received 100 objects from the database, and each record had 1 associated model (i.e., belongsTo). Using an ORM would produce 101 queries by default: one query for the original 100 records, and an additional query for each record when you access the related data on the model object. In pseudo code, let’s say you want to list all published authors that have contributed a post. From a collection of posts (each post having one author) you could get a list of author names like so:

$posts = Post::published()->get(); // one query

$authors = array_map(function($post) {
    // Produces a query on the author model
    return $post->author->name;
}, $posts);

We are not telling the model that we need all the authors, so an individual query happens each time we get the author’s name from the individual Post model instances.

Eager Loading

As I mentioned, ORMs “lazy” load associations. If you intend to use the associated model data you can trim that 101 query total to 2 queries using eager loading. You just need to tell the model what you need it to load eagerly.

Here’s an example from the Rails Active Record guide on using eager loading. As you can see, the concept is quite similar to Laravel’s eager loading concept.

# Rails
posts = Post.includes(:author).limit(100)

# Laravel
$posts = Post::with('author')->limit(100)->get();

I find that I understand ideas better by exploring them from a wider perspective. The Active Record documentation covers some examples that can further help the idea resonate.

Laravel’s Eloquent ORM

Laravel’s ORM, called Eloquent, makes it trivial to eager load models, and even to eager load nested relationships. Let’s build on the Post model example and learn how to work with eager loading in a Laravel project.

We will work through the project setup and then go through some eager loading examples in more depth to wrap up.

Setup

Let’s set up some database migrations, models, and database seeding to experiment with eager loading. If you want to follow along, I am assuming you have access to a database and can go through a basic Laravel installation.

Using the Laravel installer, let’s create the project:

laravel new blog-example

Edit your .env values to match your database of choice.

Next, we will create three models so you can experiment with eager loading nested relationships. The example is simple so we can focus on eager loading, and I’ve omitted things you might use like indexes and foreign key constraints.

php artisan make:model -m Post
php artisan make:model -m Author
php artisan make:model -m Profile

The -m flag creates a migration to go along with the model that you will use to create the table schema.

The data models will have the following associations:

Post -> belongsTo -> Author
Author -> hasMany -> Post
Author -> hasOne -> Profile

Migrations

Let’s create a simple schema for each table; each migration includes an up() method to create the table and a down() method to drop it. The migration files are in the database/migrations/ folder:

<?php

use Illuminate\Support\Facades\Schema;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;

class CreatePostsTable extends Migration
{
    /**
     * Run the migrations.
     *
     * @return void
     */
    public function up()
    {
        Schema::create('posts', function (Blueprint $table) {
            $table->increments('id');
            $table->unsignedInteger('author_id');
            $table->string('title');
            $table->text('body');
            $table->timestamps();
        });
    }

    /**
     * Reverse the migrations.
     *
     * @return void
     */
    public function down()
    {
        Schema::dropIfExists('posts');
    }
}
<?php

use Illuminate\Support\Facades\Schema;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;

class CreateAuthorsTable extends Migration
{
    /**
     * Run the migrations.
     *
     * @return void
     */
    public function up()
    {
        Schema::create('authors', function (Blueprint $table) {
            $table->increments('id');
            $table->string('name');
            $table->text('bio');
            $table->timestamps();
        });
    }

    /**
     * Reverse the migrations.
     *
     * @return void
     */
    public function down()
    {
        Schema::dropIfExists('authors');
    }
}
<?php

use Illuminate\Support\Facades\Schema;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;

class CreateProfilesTable extends Migration
{
    /**
     * Run the migrations.
     *
     * @return void
     */
    public function up()
    {
        Schema::create('profiles', function (Blueprint $table) {
            $table->increments('id');
            $table->unsignedInteger('author_id');
            $table->date('birthday');
            $table->string('city');
            $table->string('state');
            $table->string('website');
            $table->timestamps();
        });
    }

    /**
     * Reverse the migrations.
     *
     * @return void
     */
    public function down()
    {
        Schema::dropIfExists('profiles');
    }
}

Models

You need to define model associations to experiment more with eager loading. When you ran the php artisan make:model command, it created model files for you.

The first model is app/Post.php:

<?php

namespace App;

use Illuminate\Database\Eloquent\Model;

class Post extends Model
{
    public function author()
    {
        return $this->belongsTo(Author::class);
    }
}

Next, app/Author.php has two associations:

<?php

namespace App;

use Illuminate\Database\Eloquent\Model;

class Author extends Model
{
    public function profile()
    {
        return $this->hasOne(Profile::class);
    }

    public function posts()
    {
        return $this->hasMany(Post::class);
    }
}
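The third model, app/Profile.php, was generated empty by make:model. For completeness, you can define the inverse of the hasOne association there (a sketch, assuming the default author_id foreign key convention):

```php
<?php

namespace App;

use Illuminate\Database\Eloquent\Model;

class Profile extends Model
{
    // Inverse of Author::profile() — a profile belongs to one author.
    public function author()
    {
        return $this->belongsTo(Author::class);
    }
}
```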

With the models and migrations in place, you can run the migrations and then move on to experimenting with eager loading with some seeded model data.

php artisan migrate
Migration table created successfully.
Migrating: 2014_10_12_000000_create_users_table
Migrated:  2014_10_12_000000_create_users_table
Migrating: 2014_10_12_100000_create_password_resets_table
Migrated:  2014_10_12_100000_create_password_resets_table
Migrating: 2017_08_04_042509_create_posts_table
Migrated:  2017_08_04_042509_create_posts_table
Migrating: 2017_08_04_042516_create_authors_table
Migrated:  2017_08_04_042516_create_authors_table
Migrating: 2017_08_04_044554_create_profiles_table
Migrated:  2017_08_04_044554_create_profiles_table

If you check the database, you should see all the created tables!

Model Factories

To create some fake data that we can run queries against, let’s add a few model factories that we can use to seed the database with test data.

Open the database/factories/ModelFactory.php file and append the following three factories to the file below the existing User factory:

/** @var \Illuminate\Database\Eloquent\Factory $factory */
$factory->define(App\Post::class, function (Faker\Generator $faker) {
    return [
        'title' => $faker->sentence,
        'author_id' => function () {
            return factory(App\Author::class)->create()->id;
        },
        'body' => $faker->paragraphs(rand(3,10), true),
    ];
});

/** @var \Illuminate\Database\Eloquent\Factory $factory */
$factory->define(App\Author::class, function (Faker\Generator $faker) {
    return [
        'name' => $faker->name,
        'bio' => $faker->paragraph,
    ];
});

$factory->define(App\Profile::class, function (Faker\Generator $faker) {
    return [
        'birthday' => $faker->dateTimeBetween('-100 years', '-18 years'),
        'author_id' => function () {
            return factory(App\Author::class)->create()->id;
        },
        'city' => $faker->city,
        'state' => $faker->state,
        'website' => $faker->domainName,
    ];
});

These factories will make it easy to populate a bunch of posts that we can query; we can use them to create associated model data with database seeding.

Open the database/seeds/DatabaseSeeder.php file and add the following to the DatabaseSeeder::run() method:

public function run()
{
    $authors = factory(App\Author::class, 5)->create();
    $authors->each(function ($author) {
        $author
            ->profile()
            ->save(factory(App\Profile::class)->make());
        $author
            ->posts()
            ->saveMany(
                factory(App\Post::class, rand(20,30))->make()
            );
    });
}

This creates five authors, then loops through each one, saving an associated profile and many posts (between 20 and 30 per author).

We are done creating migrations, models, model factories, and database seeds. We can combine it all and re-run our migrations and database seeding in a repeatable way:

php artisan migrate:refresh
php artisan db:seed

You should now have some seeded data that you can play around with in the next section. Note that Laravel 5.5 includes a migrate:fresh command which drops the tables instead of rolling back migrations and then re-applying them.
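The two steps can also be combined, since migrate:refresh accepts a --seed flag:

```shell
php artisan migrate:refresh --seed
```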

Experimenting with Eager Loading

We are finally ready to see eager loading in action. In my opinion, the best way to visualize eager loading is logging queries to the storage/logs/laravel.log file.

To log database queries, you can either enable the MySQL log or listen for database calls from Eloquent. To log queries through Eloquent, add the following code to the app/Providers/AppServiceProvider.php boot() method:

<?php

namespace App\Providers;

use DB;
use Log;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    /**
     * Bootstrap any application services.
     *
     * @return void
     */
    public function boot()
    {
        DB::listen(function ($query) {
            Log::info($query->sql, [
                'bindings' => $query->bindings,
                'time' => $query->time,
            ]);
        });
    }

    // ...
}

I like to wrap this listener around a configuration check so that I can toggle query logging on and off. You can also get this information from the Laravel Debugbar.
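Here is one way to wrap the listener in a configuration check (a sketch; the app.log_queries key is hypothetical and would need to be added to config/app.php):

```php
public function boot()
{
    // Only register the listener when query logging is enabled.
    if (config('app.log_queries', false)) {
        DB::listen(function ($query) {
            Log::info($query->sql, [
                'bindings' => $query->bindings,
                'time' => $query->time,
            ]);
        });
    }
}
```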

Let’s see what happens when we don’t load model relations eagerly. Clear out your storage/logs/laravel.log file and run the “tinker” command.

php artisan tinker

>>> $posts = App\Post::all();
>>> $posts->map(function ($post) {
...     return $post->author;
... });
>>> ...

If you check your laravel.log file, you should see a bunch of queries to get the associated author:

[2017-08-04 06:21:58] local.INFO: select * from `posts`
[2017-08-04 06:22:06] local.INFO: select * from `authors` where `authors`.`id` = ? limit 1 [1]
[2017-08-04 06:22:06] local.INFO: select * from `authors` where `authors`.`id` = ? limit 1 [1]
[2017-08-04 06:22:06] local.INFO: select * from `authors` where `authors`.`id` = ? limit 1 [1]
....

Empty your laravel.log file again, and this time call with() to eager load the author records:

php artisan tinker

>>> $posts = App\Post::with('author')->get();
>>> $posts->map(function ($post) {
...     return $post->author;
... });
...

This time you should only see two queries in the log file. The first query for all the posts, and the second query for all the associated authors.

[2017-08-04 07:18:02] local.INFO: select * from `posts`
[2017-08-04 07:18:02] local.INFO: select * from `authors` where `authors`.`id` in (?, ?, ?, ?, ?) [1,2,3,4,5]

If you had multiple related associations, you can eager load them with an array:

$posts = App\Post::with(['author', 'comments'])->get();
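You can also constrain an eager load by passing a closure, which Eloquent applies to the relationship query. Note that when limiting columns on an eager load you must include the key the relation matches on (here, the authors’ id):

```php
$posts = App\Post::with(['author' => function ($query) {
    // Select only the columns we need; `id` is required so
    // Eloquent can match the authors back to their posts.
    $query->select('id', 'name');
}])->get();
```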

Nested Eager Loading in Eloquent

Nested eager loading works the same way. In our example, the Author model has one profile. Thus, a query will be executed for each profile.

Empty out the laravel.log file and let’s try it out:

php artisan tinker

>>> $posts = App\Post::with('author')->get();
>>> $posts->map(function ($post) {
...     return $post->author->profile;
... });
...

You will now have seven queries. The first two are the eager loads, and then a query runs for each of the five authors’ profiles the first time we access them.

With eager loading we can avoid the extra queries in nested relationships. Clear your laravel.log file one last time and run the following:

>>> $posts = App\Post::with('author.profile')->get();
>>> $posts->map(function ($post) {
...     return $post->author->profile;
... });

Now, you should only have 3 queries total:

[2017-08-04 07:27:27] local.INFO: select * from `posts`
[2017-08-04 07:27:27] local.INFO: select * from `authors` where `authors`.`id` in (?, ?, ?, ?, ?) [1,2,3,4,5]
[2017-08-04 07:27:27] local.INFO: select * from `profiles` where `profiles`.`author_id` in (?, ?, ?, ?, ?) [1,2,3,4,5]

Lazy Eager Loading

You might only need to gather associated models based on a conditional. In this case, you can lazily invoke additional queries for related data:

php artisan tinker

>>> $posts = App\Post::all();
...
>>> $posts->load('author.profile');
>>> $posts->first()->author->profile;
...

You should see three queries total, but the related queries only run once $posts->load() is called.
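For example, you might only load the relations when the caller asks for them (a sketch; the $request->has('with_authors') condition is hypothetical):

```php
$posts = App\Post::all();

if ($request->has('with_authors')) {
    // Two additional queries total, instead of one per post.
    $posts->load('author.profile');
}
```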

Conclusion

Hopefully, you learned a bit more about eager loading models and understand how it works on a deeper level. The eager loading documentation is comprehensive, and I hope that the additional hands-on practice helps you feel more confident with optimizing your relationship queries.


Writing Custom Laravel Artisan Commands


I’ve written console commands in many different languages, including Node.js, Golang, PHP, and straight up bash. In my experience, the Symfony console component is one of the best-built console libraries in existence—in any language.

Laravel’s artisan command line interface (CLI) extends Symfony’s Console component, with some added conveniences and shortcuts. Follow along if you want to learn how to create some kick-butt custom commands for your Laravel applications.

Overview

Laravel ships with a bunch of commands that aim to make your life as a developer easier, from generating models, controllers, middleware, and test cases to many other types of files for the framework.

The base Laravel framework Command extends the Symfony Command class.

Without Laravel’s console features, creating a Symfony console project is pretty straightforward:

#!/usr/bin/env php
<?php
// application.php

require __DIR__.'/vendor/autoload.php';

use Symfony\Component\Console\Application;

$application = new Application();

// ... register commands
$application->add(new GenerateAdminCommand());

$application->run();

You would benefit from going through the Symfony console component documentation, specifically creating a command. The Symfony console component handles all the pain of defining your CLI arguments, options, output, questions, prompts, and helpful information.

Laravel gets its base functionality from the console component and adds a beautiful abstraction layer that makes building console commands even more convenient.

Combine the Symfony console with the ability to create a shippable phar archive—like composer does—and you have a powerful command line tool at your disposal.

Setup

Now that you have a quick intro and some background on the console in Laravel, let’s walk through creating a custom command. We’ll build a console command that runs a health check against your Laravel application every minute to verify uptime.

I am not suggesting you ditch your uptime services, but I am suggesting that artisan makes it super easy to build a quick-and-dirty health monitor straight out of the box that we can use as a concrete example of a custom command.

An uptime checker is just one example of what you can do with your consoles. You can build developer-specific consoles that help developers be more productive in your application and production-ready commands that perform repetitive and automated jobs.

Alright, let’s create a new Laravel project with the composer CLI. You can use the Laravel installer as well, but we’ll use composer.

composer create-project laravel/laravel:~5.4 cli-demo
cd cli-demo/
# only link if you are using Laravel valet
valet link
composer require fabpot/goutte

Do you want to know the beauty of that composer command? You just used a tool that relies on the Symfony console. I also required the Goutte HTTP client that we will use to verify uptime.

Registering the Command

Now that you have a new project, we will create a custom command and register it with the console. You can do so through a closure in the routes/console.php file, or by registering the command in the app/Console/Kernel.php file’s protected $commands property. Think of the former as a Closure-based route and the latter as a controller.
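For reference, the closure style in routes/console.php looks like this (a minimal sketch with a hypothetical command name):

```php
// routes/console.php

Artisan::command('hello {name}', function ($name) {
    $this->info("Hello, {$name}!");
});
```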

We will create a custom command class and register it with the Console’s Kernel class. Artisan has a built-in command to create a console class called make:command:

php artisan make:command HealthcheckCommand

This command creates a class in the app/Console/Commands/HealthcheckCommand.php file. If you open the file, you will see the $signature and the $description properties, and a handle() method that is the body of your console command.

Adjust the file to have the following name and description:

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class HealthcheckCommand extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'healthcheck
                            {url : The URL to check}
                            {status=200 : The expected status code}';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Runs an HTTP healthcheck to verify the endpoint is available';

    /**
     * Create a new command instance.
     *
     * @return void
     */
    public function __construct()
    {
        parent::__construct();
    }

    /**
     * Execute the console command.
     *
     * @return mixed
     */
    public function handle()
    {
        //
    }
}

Register the command in the app/Console/Kernel.php file:

protected $commands = [
    Commands\HealthcheckCommand::class,
];

If you run php artisan help healthcheck, you should see the command’s description along with its url and status arguments.

Setting up the HTTP Client Service

You should aim to make your console commands “light” and defer to application services to accomplish your tasks. The artisan CLI has access to the service container, which allows us to inject an HTTP client into the constructor of our command.

In the app/Providers/AppServiceProvider.php file, add the following to the register method to create an HTTP service:

// app/Providers/AppServiceProvider.php

public function register()
{
    $this->app->singleton(\Goutte\Client::class, function ($app) {
        $client = new \Goutte\Client();
        $client->setClient(new \GuzzleHttp\Client([
            'timeout' => 20,
            'allow_redirects' => false,
        ]));

        return $client;
    });
}

We set up the Goutte HTTP crawler and configure the underlying Guzzle client with a few options: a timeout (which you could make configurable) and disabled redirects, because we want to know the real status of an HTTP endpoint.

Next, update the HealthcheckCommand::__construct() method with the service you just defined. When Laravel constructs the console command, the dependency will be resolved out of the service container automatically:

use Goutte\Client;

// ...

/** @var \Goutte\Client */
private $client;

/**
 * Create a new command instance.
 *
 * @return void
 */
public function __construct(Client $client)
{
    parent::__construct();

    $this->client = $client;
}

The Health Check Command Body

The last method in the HealthcheckCommand class is the handle() method, which is the body of the command. We will get the {url} argument and the status value, and check that the URL returns the expected HTTP status code.

Let’s flesh out a simple command to verify a healthcheck:

/**
 * Execute the console command.
 *
 * @return mixed
 */
public function handle()
{
    try {
        $url = $this->getUrl();
        $expected = (int) $this->argument('status');
        $crawler = $this->client->request('GET', $url);
        $status = $this->client->getResponse()->getStatus();
    } catch (\Exception $e) {
        // Read the raw argument here; $url is undefined if getUrl() threw.
        $url = $this->argument('url');
        $this->error("Healthcheck failed for $url with an exception");
        $this->error($e->getMessage());
        return 2;
    }

    if ($status !== $expected) {
        $this->error("Healthcheck failed for $url with a status of '$status' (expected '$expected')");
        return 1;
    }

    $this->info("Healthcheck passed for $url!");

    return 0;
}

private function getUrl()
{
    $url = $this->argument('url');

    if (! filter_var($url, FILTER_VALIDATE_URL)) {
        throw new \Exception("Invalid URL '$url'");
    }

    return $url;
}

First, we validate the URL argument and throw an exception if the URL isn’t valid. Next, we make an HTTP request to the URL and compare the expected status code to the actual response.

You could get even fancier with the HTTP client and crawl the page to verify status by checking for an HTML element, but we just check for an HTTP status code in this example. Feel free to play around with it on your own and expand on the healthcheck.

If an exception happens, we return a different status code for exceptions coming from the HTTP client. Finally, we return a 1 exit code if the HTTP status isn’t valid.

Let’s test out our command. If you recall, I linked my project with valet link:

$ php artisan healthcheck http://cli-demo.dev
Healthcheck passed for http://cli-demo.dev!

$ php artisan healthcheck http://cli-demo.dev/example
Healthcheck failed for http://cli-demo.dev/example with a status of '404' (expected '200')
$ echo $?
1

The healthcheck is working as expected. Note that the second command that fails returns an exit code of 1. In the next section, we’ll learn how to run our command on a schedule, and we will force a failure by shutting down valet.

Running Custom Commands on a Schedule

Now that we have a basic command, we are going to hook it up on a scheduler to monitor the status of an endpoint every minute. If you are new to Laravel, the Console Kernel allows you to run Artisan commands on a schedule with a nice fluent API. The scheduler runs every minute and checks to see if any commands need to run.

Let’s set up this command to run every minute:

protected function schedule(Schedule $schedule)
{
    $schedule->command(
        sprintf('healthcheck %s', url('/'))
    )
        ->everyMinute()
        ->appendOutputTo(storage_path('logs/healthcheck.log'));
}

In the schedule method, we are running the command every minute and sending the output to a storage/logs/healthcheck.log file so we can visually see the results of our commands. Take note that the scheduler has both an appendOutputTo() method and a sendOutputTo() method. The latter will overwrite the output every time the command runs, and the former will continue to append new items.

Before we run this, we need to adjust the URL. By default, the url('/') function will probably return http://localhost unless you’ve updated the .env file already. Let’s do so now so we can fully test out the healthcheck against our app:

# .env file
APP_URL=http://cli-demo.dev

Running the Scheduler Manually

We are going to simulate running the scheduler on a cron that runs every minute with bash. Open a new tab so you can keep it in the foreground and run the following infinite while loop:

while true; do php artisan schedule:run; sleep 60; done

If you are watching the healthcheck.log file, you will start to see output like this every sixty seconds:

tail -f storage/logs/healthcheck.log
Healthcheck passed for http://cli-demo.dev!
Healthcheck passed for http://cli-demo.dev!
Healthcheck passed for http://cli-demo.dev!

If you are following along with Valet, let’s shut it down, so the scheduler fails. Shutting down the web server simulates an application being unreachable:

valet stop
Valet services have been stopped.

# from the healthcheck.log
Healthcheck failed for http://cli-demo.dev with an exception
cURL error 7: Failed to connect to cli-demo.dev port 80: Connection refused (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)

Next, let’s bring our server back and remove the route so we can simulate an invalid status code.

valet start
Valet services have been started.

Next, comment out the route in routes/web.php:

// Route::get('/', function () {
//     return view('welcome');
// });

If you aren’t running the scheduler, start it back up, and you should see an error message when the scheduler tries to check the status code:

Healthcheck failed for http://cli-demo.dev with a status of '404' (expected '200')

Don’t forget to shut down the infinite scheduler tab with Ctrl + C!

Further Reading

Our command simply outputs the result of the healthcheck, but you could expand upon it by broadcasting a failure to Slack or logging it to the database. On your own, try to set up some notification when the healthcheck fails. Perhaps you can even add logic so that it only alerts after three consecutive failures. Get creative!

We covered the basics of running your custom command, but the documentation has additional information we didn’t cover. You can easily do things like prompt users with questions, render tables, and a progress bar, to name a few.

I also recommend that you experiment with the Symfony console component directly. It’s easy to set up your own CLI project with minimal composer dependencies. The documentation provides knowledge that will also apply to your artisan commands, for example, when you need to customize things like hiding a command from the command list.

Conclusion

When you need to provide your custom console commands, Laravel’s artisan command provides nice features that make it a breeze to write your own CLI. You have access to the service container and can create command-line versions of your existing services. I’ve built CLI tools for things like helping me debug 3rd party APIs, provide formatted details about a record in the database, and perform cache busting on a CDN.

Learn how to set up Xdebug for PhpStorm and Laravel Valet


I’ve been developing web applications for about 15 years, but somehow Xdebug is still challenging to set up. Follow along and learn how to find Xdebug settings and configure it for local development with PhpStorm.

If you are using PHP-FPM locally, you might have run into an annoying issue that no matter what you do Xdebug won’t work! I might have an answer for you in this video.

Here are my Xdebug settings from the video so you can easily copy and paste them if you want:

[xdebug]
zend_extension="/usr/local/opt/php71-xdebug/xdebug.so"
xdebug.remote_autostart=1
xdebug.default_enable=1
xdebug.remote_port=9001
xdebug.remote_host=127.0.0.1
xdebug.remote_connect_back=1
xdebug.remote_enable=1
xdebug.idekey=PHPSTORM

If you find this video useful, the best thing you can do is follow our YouTube channel and share this article on your favorite social networks!

Watch the Laracon US Keynote by Taylor Otwell


StreamACon just released the first Laracon US conference video and it’s Taylor’s keynote. During this talk, he goes through some of the new Laravel 5.5 features and finishes by announcing Laravel Horizon.

You can watch it now on the StreamACon website and the remaining talk videos will be released in two weeks.

Lead photo by @ninjaparade

Ziggy: A Package for Named Laravel Routes in JavaScript


Ziggy is a package that exposes named Laravel routes in JavaScript. When you update named routes in Laravel, the front end will automatically stay in sync.

What I like about this package is that all you have to do (after installing the package) is use the provided @routes blade directive in your layout:

<!DOCTYPE html>
<html lang="en">
  <!-- ... -->
  <body>
    @routes
    <script src="{{ mix('js/app.js') }}"></script>
  </body>
</html>

The @routes directive defines named routes in a JavaScript variable and provides you with a route() JavaScript helper similar to Laravel’s PHP route helper:

var routeUrl = route('posts.index');
var routeUrl = route('posts.show', {post: 1337});

// Returns results from /posts
return axios.get(route('posts.index'))
    .then((response) => {
        return response.data;
    });

Updating named Laravel routes on the backend will automatically keep your front end requests in sync.

Ziggy was written by Daniel Coulbourne from Tighten Co, a Laravel partner. Daniel Coulbourne explains the background story about why he wrote Ziggy:

My JavaScript is full of axios calls, which are made to hard-coded API endpoint URLs in my Laravel apps. This can be a real pain when an endpoint route needs to be moved to a group with a URL prefix, when a parameter needs to be added, or when any other URL-breaking change needs to be made.

This is why, in Laravel, we have named routes, and why we don’t hard-code URLs in our Blade templates. By abstracting away the need for our consuming code to know the exact (down-to-the-letter) URL for a route, we are able to write more change-resilient code—and avoid a lot of “find and replace.”

We wanted the same kind of protection and convenience in JavaScript that we have inside our Laravel apps. And that’s why we built Ziggy.

Read Daniel’s full post introducing Ziggy; he also mentions other packages similar to Ziggy and some inspirational ideas around the concept. You can find Ziggy’s source code on GitHub.

Bootstrap 4 Releases Its First Beta


Bootstrap CSS has just released its first beta release of the new version 4 codebase.

Two years in the making, we finally have our first beta release of Bootstrap 4. In that time, we’ve broken all the things at least twenty-seven times over with nearly 5,000 commits, 650+ files changed, 67,000 lines added, and 82,000 lines deleted. We also shipped six major alpha releases, a trio of official Themes, and even a job board for good measure. Put simply? It’s about time.

Some of the highlights of V4 include:

Moved from Less to Sass.

Bootstrap now compiles faster than ever thanks to Libsass, and we join an increasingly large community of Sass developers.

Flexbox and an improved grid system.

We’ve moved nearly everything to flexbox, added a new grid tier to better target mobile devices, and completely overhauled our source Sass with better variables, mixins, and now maps, too.

Dropped wells, thumbnails, and panels for cards.

Cards are a brand new component to Bootstrap, but they’ll feel super familiar as they do nearly everything wells, thumbnails, and panels did, only better.

Dropped IE8 and IE9 support, dropped older browser versions, and moved to rem units for component sizing to take advantage of newer CSS support.

Aside from our grid, pixels have been swapped for rems and ems where appropriate to make responsive typography and component sizing even easier. Need support for IE8/IE9, Safari 8-, iOS 8-, etc? Keep using Bootstrap 3.

For complete details see the Bootstrap official announcement.

Manage Laravel Forge Servers from Android with ForgeApp

Whether you’re waiting for a movie to start, camping, or sitting in a boring meeting, you can now manage Laravel forge on your Android device with ForgeApp for Android.

Some of the main features of the app include:

  • Add one or multiple Forge accounts
  • Check Servers and Sites for each account
  • Reboot or stop Nginx, MySQL, and Postgres on your servers.
  • Update configuration files
  • Deploy and check deployment logs

The ForgeApp app uses the familiar material design look and feel:

Laravel Forge Android

Thanks to the release of the Forge API, developers are coming up with some fresh products around the Forge ecosystem. For example, Anvil is a Forge app for iOS users.

Andrés Santibáñez is the creator of ForgeApp. You can find the app for free on the Google Play store.

Embed anywhere CMS (works everywhere) – sponsor

Easily add content management anywhere on your site

Component IO lets you avoid endless back-and-forth work when your team wants to make changes to content on your website. It’s a modular, component-based content management system that is easy to add anywhere, on any new or existing website.

Empower your team to make changes themselves directly from your web page without having to learn a new tool or ask you to update the code every time.

Component IO is built with Vue.js and works with every language, framework & platform and is 100% customizable.

Simple setup

Setup is as simple as copy and paste. You add component tags to your HTML with a single script tag at the bottom and everything just works. For example:

<component key=klaln load=b></component>

<script project="example" src="https://cdn.component.io/v1"></script>

This component adds a content block with a picture of a beach Corgi and could be placed anywhere on your web page. Check it out: https://jsfiddle.net/component/1py4jmyp/

Edit everything

Editing content is easy for non-technical users — they can open a pop-up editor directly from your own web page and make content changes themselves:

There are no new tools they have to learn and no new dashboard they have to navigate. They can make edits themselves and see updates directly on the website.

Components are 100% customizable

You can completely customize the look and functionality of any component by adding custom input fields and custom HTML, CSS, and JavaScript.

This means you can replace anything on your website — content blocks, images, navigation menus, social media links, and more — with modular, editable components.

For the whole team

Whether you work solo, with a team, or with clients, everyone saves time when edits are easy.

You can create and collaborate across multiple projects, and manage invites & editing privileges all from one account.

Built by web developers to make web development simpler

Save time by making your workflow less complicated, and empower your clients and non-technical teammates to make changes on their own.

Component IO is free for small projects and has fair, usage-based pricing at scale: stop spending time on content changes and do something interesting instead.

Set up your project for free at Component IO


Many thanks to Component IO for sponsoring Laravel News this week.


Inbound Email in Laravel

I recently needed the ability to receive emails and process their attachments. I love Mailgun for sending transactional email, so when I needed to process incoming mail I started digging deeper into Mailgun and discovered its powerful inbound email routing features!

Come along and learn how you can set up a webhook to process inbound email and secure it within your Laravel applications. We will even use Laravel Valet (or ngrok directly) to test it out locally!

What is Inbound Routing?

If you’ve ever had to deal with directing email into a web application, you know how painful, error-prone, and high-maintenance a homegrown solution can be.

Mailgun takes the pain out of handling incoming mail, parsing it, and dispatching it. Using it has helped me come up with clever ways to automate around incoming email.

Imagine, if you are building an issue system, you could easily parse the email and associate it back with a ticket in your system. Your users could reply to a support email directly instead of logging into an application to communicate on an issue. Mailgun parses out the email body and attachments, so you could skip saving the email signature and save the attachments to the ticket. GitHub issue emails are a good example of inbound email processing that allows you to send messages to a GitHub issue from email.

Mailgun's inbound routing does an excellent job parsing email messages and delivering them to your webhook as JSON. The JSON payload includes attachments, the signature, the email body as stripped plain text, and much more. The inbound engine allows you to pipe incoming emails through various rules.

Setting up Mailgun

Mailgun provides a generous free tier, giving you room to experiment with inbound email. If you want to follow along, you need a Mailgun account with DNS configured for inbound mail, including MX records so your domain can receive messages.

While I’m not going to walk you through setting up Mailgun for your domain, I will walk you through setting up an example webhook and then run it locally.

One area I was concerned about when I integrated Mailgun inbound routing was the ability to experiment and try it out locally. Having to deploy to test it out would have been clunky and annoying. Fortunately, Laravel Valet makes it trivial to test webhooks locally; I was blown away by how easy it was to use Valet (really ngrok) to walk through this process end-to-end on my local machine.

If you are not using Valet, you can use ngrok directly just the same, and I’ll walk you through that.

Creating a Project

Imagine that your customers are emailing in widget orders and your job is to set up a webhook that takes these orders and processes them. They can send attachments along, and you need to handle those too.

Let’s create a new Laravel project and walk through setting up a secure webhook for widgets. You can create the project with either the Laravel installer or Composer:

# Valet
$ laravel new mgexample

# Composer
$ composer create-project laravel/laravel:5.4 mgexample

# Link Valet
$ cd mgexample
$ valet link

To use Mailgun as the email provider, you need to update MAIL_DRIVER and add the Mailgun domain and secret values from your account:

MAIL_DRIVER=mailgun
MAILGUN_DOMAIN=mg.example.com
MAILGUN_SECRET=horizon

Setting the MAIL_DRIVER isn’t a requirement for handling inbound email. The Mailgun configuration is in config/services.php.
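For reference, the Mailgun entry in config/services.php reads those environment values:

```php
'mailgun' => [
    'domain' => env('MAILGUN_DOMAIN'),
    'secret' => env('MAILGUN_SECRET'),
],
```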

And that’s it for the setup. It’s time to write a Webhook route and quickly try it out!

The Webhook Route

You can configure inbound rules to store an email message temporarily (up to three days) and notify a webhook. We will define a stateless API route for the webhook in the routes/api.php file. The route will end up being something like POST http://example.com/api/mailgun/widgets. You can name the route and controller whatever you want.

First, let’s create a controller for this route:

php artisan make:controller MailgunWidgetsController

Next, let’s define the route in routes/api.php:

Route::group([
    'prefix' => 'mailgun',
], function () {
    Route::post('widgets', 'MailgunWidgetsController@store');
});

I am using a route group because I like to collect my Mailgun routes in one place. Each inbound route will eventually use a webhook too, so grouping them makes it easy to apply the webhook security middleware to all of them.

For now, we’ll just log the request so we can see what Mailgun sends in the webhook payload:

<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;

class MailgunWidgetsController extends Controller
{
    public function store()
    {
        app('log')->debug(request()->all());

        return response()->json(['status' => 'ok']);
    }
}

Note that if you don’t return a 200 status code, the service will retry later until a 200 is received or the maximum number of retry attempts is reached.

That’s all we need to start receiving mail in our application, although we have a bit more setup to do in the Mailgun control panel.

Mailgun Route Setup

Before we try out our endpoint, we need to set up a new route in Mailgun. There are various options and ways to process a message, but we will pick a relatively simple setup.

First, let’s start up Valet sharing so we can use the URL to configure the webhook:

$ valet share

ngrok by @inconshreveable                                                           (Ctrl+C to quit)

Session Status                online
Update                        update available (version 2.2.8, Ctrl-U to update)
Version                       2.1.18
Region                        United States (us)
Web Interface                 http://127.0.0.1:4040
Forwarding                    http://d1fc8c85.ngrok.io -> mgexample.app:80
Forwarding                    https://d1fc8c85.ngrok.io -> mgexample.app:80

If you’re not using Valet, install ngrok and run it:

ngrok http example.dev:80

Next, copy the forwarding URL from ngrok and configure a new route in Mailgun:

We are going to match the widget-orders@mg.example.com recipient. Replace the example email address with your own domain.

Matching the recipient means that any orders sent to the email will trigger the webhook.

Next, the store-and-notify rule will store the message and send the JSON payload to the webhook we defined. You can even specify multiple webhook URLs by separating them with commas!

We check “stop” to prevent further rules from firing. We only have one route, but if you don’t want other routes processing the same message, you need to stop. Think of it like JavaScript’s Event.stopPropagation() method.

Leave the remaining options at their defaults, and add a description if you want.

Click “Create Route” and navigate to the logs section so you can see incoming messages we are about to send in the Mailgun console.

Running the Webhook with Laravel Valet

With Valet running (or ngrok), we are ready to try it out. All you need to do is create an email and send it to the email address you configured.

After you send the email, head over to the terminal running ngrok and you should see an HTTP request:

HTTP Requests
-------------

POST /api/mailgun/widgets      200 OK

Also, check your storage/logs/laravel.log file, and you should see the JSON payload. I sent an example CSV file which looks like this in the webhook payload:

"attachments": [
    {
        "url": "https://se.api.mailgun.net/v3/domains/mg.example.com/messages/eyJwIjpmYWxzZSwiayI6IjBhYjM5MWE5LTU5YzUtNGJkMS1hMzE5LTBhNjU0ODAwOTY4ZCIsInMiOiIyYWMyN2YxYzc2IiwiYyI6InRhbmtiIn0=/attachments/0",
        "content-type": "text/csv",
        "name": "widget-order.csv",
        "size": 554
    }
]

So Mailgun is storing this attachment on their end, and we can get it through the Mailgun API. Before I show you that, let’s secure the webhook, so we know for certain that the request is genuine.

Securing the Webhook

To secure the webhook, Mailgun sends a timestamp and a token in the JSON POST body. Using our configured secret, we can encode the timestamp and token and then compare that value to the signature key in the JSON payload.

To verify webhooks, you need to use the HMAC algorithm using your API secret as the key and SHA256.
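As a quick illustration of that scheme, you can reproduce a signature on the command line with openssl (the timestamp, token, and key below are made-up values):

```shell
# Hypothetical values; Mailgun sends the real ones in the POST body.
timestamp="1500000000"
token="a1b2c3d4e5f6"
api_key="key-example"

# signature = HMAC-SHA256(timestamp . token), keyed with the API secret
signature=$(printf '%s%s' "$timestamp" "$token" \
    | openssl dgst -sha256 -hmac "$api_key" \
    | awk '{print $NF}')

echo "$signature"   # a 64-character hex digest
```

The middleware we build next performs the same computation with PHP’s hash_hmac() and compares the result to the signature field of the payload.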

We will create a middleware that checks these values to make sure the request is legitimate. You can add another layer of security by storing the token in something like Redis and rejecting any subsequent requests with the same token. We won’t cover that here, but it’s easy to do.

Let’s create a middleware and get it registered as an API middleware:

$ php artisan make:middleware ValidateMailgunWebhook
Middleware created successfully.

Next, register the middleware in app/Http/Kernel.php:

protected $routeMiddleware = [

    // ...

    'mailgun.webhook' => \App\Http\Middleware\ValidateMailgunWebhook::class,
];

Before we forget, let’s update the route group to use this middleware in routes/api.php:

Route::group([
    'prefix' => 'mailgun',
    'middleware' => ['mailgun.webhook'],
], function () {
    Route::post('widgets', 'MailgunWidgetsController@store');
});

Finally, here’s my implementation of the middleware:

<?php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Response;

class ValidateMailgunWebhook
{
    public function handle($request, Closure $next)
    {
        if (!$request->isMethod('post')) {
            abort(Response::HTTP_FORBIDDEN, 'Only POST requests are allowed.');
        }

        if ($this->verify($request)) {
            return $next($request);
        }

        abort(Response::HTTP_FORBIDDEN);
    }

    protected function buildSignature($request)
    {
        return hash_hmac(
            'sha256',
            sprintf('%s%s', $request->input('timestamp'), $request->input('token')),
            config('services.mailgun.secret')
        );
    }

    protected function verify($request)
    {
        if (abs(time() - $request->input('timestamp')) > 15) {
            return false;
        }

        return $this->buildSignature($request) === $request->input('signature');
    }
}

Let’s break this down a little to explain what’s going on.

First, if the request method is not a POST, a 403 response is sent back. Next, we verify the request, and if the verification succeeds, we allow the request to proceed. Last, we abort by default and send back a 403—we defensively protect the route unless it’s valid.

The verify() method checks the request timestamp and makes sure the request is no more than fifteen seconds old. It then compares the request’s signature to our own signature built from the timestamp and token.

The buildSignature() method hashes the concatenated timestamp and token with HMAC-SHA256, using the Mailgun secret as the key.

Testing the Secured Route

If you send another email, your middleware should still allow the valid requests to go through. However, if you send a request from the terminal, you will get back a 403 now:

$ curl -I -X POST http://mgexample.dev/api/mailgun/widgets
HTTP/1.1 403 Forbidden
Server: nginx/1.10.3

Controller Examples

To wrap things up, let’s discuss a few tips for the controller. I find that dispatching a job is the best way to handle Mailgun webhooks after verifying that I am happy with the payload. Usually, I want to do some asynchronous processing on my end, so dispatching the payload to a job makes sense.

If your use-case is simple, you don’t have to complicate things though. Perhaps you just need to store a value in the database and do some minimal work—use your best judgment.

Let’s say I want to process a collection of files of a particular type. You might have something like this in your controller:

public function store(Request $request)
{
    app('log')->debug($request->all());

    $files = collect(json_decode($request->input('attachments'), true))
        ->filter(function ($file) {
            return $file['content-type'] == 'text/csv';
        });

    if ($files->count() === 0) {
        return response()->json([
            'status' => 'error',
            'message' => 'Missing expected CSV attachment'
        ], 406);
    }

    dispatch(new ProcessWidgetFiles($files));

    return response()->json(['status' => 'ok'], 200);
}

Note that if you send back a 406 (Not Acceptable) response code, Mailgun will assume the POST is rejected and not retry. If you send back a 200, the webhook is successful. If the controller returns any other status code, Mailgun will retry on a schedule until it succeeds or ultimately fails after the maximum number of retry attempts.

In the dispatched job, you could use Guzzle to download the files and process them in your job class:

use GuzzleHttp\Client;

$response = (new Client())->get($file['url'], [
    'auth' => ['api', config('services.mailgun.secret')],
]);

// do something with $response->getBody();

The file request uses Guzzle’s auth key to use basic HTTP authentication. Note that these files are only temporarily available for a few days.

You’ve Got Mail

So that’s a whirlwind tour of using Mailgun’s inbound email routing with a webhook in Laravel. I’ve just scratched the surface of what you can do!

Resolving Git Conflicts with Git Mergetool

I’ve honed my workflow for resolving conflicts during git merges and rebases over the years. Along the way, I’ve added the git mergetool command to my toolbelt, which keeps me productive when resolving routine merge conflicts. By default on OS X, git uses vimdiff as the mergetool, but in this video I am going to show you how to use the bundled FileMerge app to visually resolve code conflicts with git on OS X.

Google Chrome Puppeteer for Headless Automation

Google Chrome Puppeteer is a Node library that provides a high-level API for working with headless Chrome:

Puppeteer is a Node library which provides a high-level API to control headless Chrome over the DevTools Protocol. It can also be configured to use full (non-headless) Chrome.

According to the GitHub repository, here are a few examples of how you can use Puppeteer:

  • Generate screenshots and PDFs of pages.
  • Crawl a SPA and generate pre-rendered content (i.e. “SSR”).
  • Scrape content from websites.
  • Automate form submission, UI testing, keyboard input, etc.
  • Create an up-to-date, automated testing environment. Run your tests directly in the latest version of Chrome using the latest JavaScript and browser features.
  • Capture a timeline trace of your site to help diagnose performance issues.

When you install the package, Puppeteer will download a recent version of Chromium, which helps guarantee the API will work out of the box. You can also configure different versions of the browser if needed and run it in full (non-headless) mode.

Here’s an example of the API in action from the readme:

const puppeteer = require('puppeteer');

(async() => {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.goto('https://example.com');
    await page.screenshot({path: 'example.png'});

    browser.close();
})();

If you want to use Puppeteer for testing automation, be aware that the API is designed specifically for Chrome.

Google Chrome first shipped a headless environment in Chrome 59 for Linux and Mac (Chrome 60 on Windows), bringing modern web features to the command line without the overhead of the full browser.

Puppeteer requires Node version 7.10 or greater. Read more about Puppeteer on GitHub.

On a related note, the next version of Laravel Dusk will also configure the headless option by default (see DuskTestCase.stub).

Kubernetes at GitHub

GitHub has been rolling out Kubernetes on github.com over the last year. I find beautiful irony in the fact that GitHub hosts the source code for so many popular open source projects.

Think about it.

GitHub hosts Ruby, Rails, MySQL, and many other open-source repositories, then turns around and uses those very technologies to build GitHub itself. It’s “dogfooding” to the extreme.

Over the last year or so, GitHub has been experimenting with and integrating Docker containers with Kubernetes. Kubernetes is “an open-source system for automating deployment, scaling, and management of containerized applications.”

Over the last year, GitHub has gradually evolved the infrastructure that runs the Ruby on Rails application responsible for github.com and api.github.com. We reached a big milestone recently: all web and API requests are served by containers running in Kubernetes clusters deployed on our metal cloud. Moving a critical application to Kubernetes was a fun challenge, and we’re excited to share some of what we’ve learned with you today.

GitHub’s adoption is huge for the containerization and Docker movement. Organizations of massive scale are moving critical applications to containers to create more resilient and reliable infrastructure. GitHub’s move to container orchestration appears to enable more of a self-service platform, with less strain on operations and site reliability engineers (SREs). One huge benefit is insulating an application from differences between environments.

Previously, GitHub seems to have relied on Capistrano to deploy code to all frontend application servers. Deployment is accomplished with SSH connections to each server, where code is updated and services are restarted. If a rollback is needed, Capistrano SSHes into each server and points a symlink back at a previous release.
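The symlink pattern described above can be sketched in a few commands (hypothetical paths; Capistrano automates this over SSH on every server):

```shell
# Each deploy unpacks code into its own release directory...
mkdir -p releases/v1 releases/v2

# ...and 'current' (what the web server serves) points at the live one.
ln -sfn releases/v2 current    # deploy v2
readlink current               # -> releases/v2

# Rolling back is just repointing the symlink at the previous release.
ln -sfn releases/v1 current
readlink current               # -> releases/v1
```

Because only the symlink moves, a rollback is nearly instantaneous, but every server still has to be touched individually.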

In a traditional application server environment, when you need additional application nodes you use configuration management tools to provision servers. Once the servers are ready, you deploy code to them through a tool like Capistrano and finally move them into place to accept traffic. I know from personal experience that expanding the pool of nodes and rolling back code versions can be cumbersome in a traditional server environment. I don’t have intimate knowledge of GitHub’s infrastructure and deployment process; I can only speak from my own experience of how I scaled services before I started using Docker.

Since Docker images produce a single code artifact, rolling back from a bad release is usually much smoother. Kubernetes takes care of rolling back to the desired state. Plus, you can run a bad version in another environment in a repeatable way to further isolate issues.

Now, all web (github.com) and API (api.github.com) requests are served from containers running in Kubernetes on GitHub’s metal cloud.

Read all about the journey on GitHub’s engineering blog.

New Laravel String Helpers in 5.4 and 5.5

New string helpers are finding their way into Laravel leading up to the big 5.5 release, which is planned to drop during Laracon EU 2017.

Here are some of the recent highlights in the String helper class, which has been getting some love lately.

The Str::start() Helper

The Str::start() helper was contributed by Caleb Porzio and made it into the Laravel 5.4 branch. This helper makes sure a string begins with exactly one instance of a given value.

I think I’ve written the following bit of code at least a hundred times. Let’s say you have an API client baseUrl, and you normalize the URL by removing the trailing slash:

<?php

return [
    'my_api' => [
        'base_url' => rtrim(env('MY_API_BASE_URL'), '/'),
    ],
];

And then when you need to normalize the path to avoid multiple forward slashes, you might do something like the following:

<?php

return config('my_api.base_url') . '/' . ltrim($path, '/');

Now with Str::start() and the accompanying str_start() function, your path will be normalized:

<?php

$path = '//example';

config('my_api.base_url') . str_start($path, '/');

// -> https://my-api.com/example

You can also hear more about this helper in the Twenty Percent Time Podcast episode State Machines.

The Str::before() Helper

The before helper was released into laravel/framework master last month and is exactly the inverse of the str_after helper.

Imagine you wanted to get the first part of an email address:

<?php

str_before('jane@example.com', '@');
// -> jane

This helper will be in the upcoming Laravel 5.5 release.

The Str::after() Helper

The Str::after() helper returns everything after a given value in a string. Sticking with the email example, let’s say we wanted just to grab the hostname from an email:

<?php

str_after('jane@example.com', '@');
// -> example.com

The string game is strong with Caleb Porzio, who contributed the Str::after() helper earlier this year!

Learn about all the Helpers

Laravel has an incredible number of helpers for things like arrays, strings, and URLs. You should check out the official helpers documentation to learn more. I find a useful helper each time I review them.

GitHub Embedded Code Snippets

GitHub rolled out embedded code snippets this month which should make it easier to share existing code without having to re-create it in a comment.

For a while, GitHub has had the ability to highlight lines with a URL hash. For example, if you want to highlight lines 10-15, you would append #L10-L15 to the URL.

Now when you highlight lines, GitHub provides a menu allowing you to copy the selected lines, copy the permalink, and open a new issue which contains the embedded code.

Another neat trick I discovered while learning about this feature: while viewing a file on GitHub.com, click a line number in the gutter to highlight that line, then Shift+Click another line number to highlight the entire range between them. If you keep holding Shift and clicking line numbers further down, the UI extends the highlighted selection.

I’ve shared highlighted lines manually for some time now, and it has always been a pain to copy the code and paste it in a pull request or issue. Embedded code snippets are a great productivity improvement for me, as I spend a good chunk of time reviewing others’ code and working through internal issues and bugs.

Another benefit of embedded snippets is they link back to the original file and show a nicely formatted code block with contexts such as line numbers and file name, so you know which file is being shared without any explanation on your part.

You can learn more about embedded code snippets on the official GitHub blog post Introducing Embedded Code Snippets.

Quickly Edit Your Theme Colors and Fonts in PhpStorm

PhpStorm has released a new feature that allows you to quickly find and edit the theme style for a given line you want to change. To change the theme style for a given line, put your cursor on the line you want to change, press Double Shift (Search Everywhere), and type Jump to Colors & Fonts.

Most lines have a hierarchy of settings, which you can select from a list after hitting enter:

Once you select the option that you want to modify, a popup window will allow you to tweak the styles associated with the line:

One thing I’ve noticed about custom themes in PhpStorm is that they are not always complete for every language or UI window within the IDE. For example, I commonly see themes start to break down when viewing code in the diff tool, highlighting text, or using the console.

I feel like this feature will help me experiment with themes I’ve wanted to improve but didn’t know where to start with. I can progressively tweak the theme as I find issues instead of staring at the entire theme file, trying to pick out the changes I want.

Besides code highlighting tweaks, this feature can also be used for styling search results, error highlighting, the debugger UI, and the console. These areas are where I usually see themes being incomplete or hard to use. Previously, I’ve reverted otherwise great looking themes because the debugger and search were illegible.

Learn more about this feature on the PhpStorm blog.


Laravel Bash Aliases

Bash aliases are shortcuts added to a file that allow you to reference another command through more memorable words, abbreviations, or characters. For example, if you use Git you may run git status many times throughout the day, so to save yourself time and keystrokes you could alias gs to git status and it’ll automatically expand and call the proper command.
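The gs shortcut mentioned above looks like this in practice (the expand_aliases line is only needed inside scripts; interactive shells expand aliases by default):

```shell
shopt -s expand_aliases    # bash scripts don't expand aliases by default
alias gs='git status'

# 'type' shows what the shortcut expands to
type gs
```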

Over the years I’ve seen a lot of unusual aliases and many are unique to the person. Shortcuts that make sense to me, might be confusing and weird to you. That’s what makes these so fun.

To gain some insight into what others are doing, I asked the community to share theirs with me, and quite a few responded. It surprised me how similar many are: almost everyone made shortcuts for the Artisan command, yet everyone has a different pattern, for example, “a”, “pa”, or “art” for php artisan. Another unique one that a few shared is a command called “nah”:

nah='git reset --hard;git clean -df'

That one is really nice. To demonstrate how it works, pretend you started working on a new feature, added a few new files, and then after lunch decided everything you’ve done is wrong. Running nah resets your code back to the last commit and deletes any files not known to git. Very convenient and useful!

How To Create Your Own Base Aliases

For those new to creating bash aliases, the process is pretty simple. First, open the ~/.bashrc file, found in your home directory, in a text editor. Uncomment or add the following lines:

if [ -f ~/.bash_aliases ]; then
    . ~/.bash_aliases
fi

This tells bash to load a ~/.bash_aliases file if it exists, so you can keep all your aliases in one place, making them easier to share and maintain. Finally, create the ~/.bash_aliases file and add the following as your first alias:

alias art="php artisan"

Save the file and type the following in the terminal:

source ~/.bashrc

From here on you should be able to type art and it’ll auto expand into php artisan. Just remember that every time you modify the bash_aliases file you’ll need to either run that source command or restart Terminal so the changes are picked up.

Laravel Bash Aliases from the Community

Below is a list of all the Laravel community contributions and what they are using.

WaveHack

# Laravel

artisan() {
  if [ -f bin/artisan ]; then
    php bin/artisan "$@"
  else
    php artisan "$@"
  fi
}

alias serve='artisan serve'
alias tinker='artisan tinker'

# Misc PHP

t() {
  if [ -f vendor/bin/phpunit ]; then
    vendor/bin/phpunit "$@"
  else
    phpunit "$@"
  fi
}

bmadigan

nah='git reset --hard;git clean -df'
vm='ssh vagrant@127.0.0.1 -p 2222'

Tainmar

pa='php artisan'

Mohamed Said

alias dracarys="git reset --hard && git clean -df"
alias copyssh="pbcopy < $HOME/.ssh/id_rsa.pub"
alias reloadcli="source $HOME/.zshrc"
alias zshrc="/Applications/Sublime\ Text.app/Contents/SharedSupport/bin/subl ~/.zshrc "
alias shrug="echo '¯\_(ツ)_/¯' | pbcopy";
alias fight="echo '(ง'̀-'́)ง' | pbcopy";

# This one opens a PR from the current branch
function openpr() {
  br=`git branch | grep "*"`
  repo=$1
  parentBranch=$2

  open -a /Applications/Google\ Chrome.app  https://github.com/${repo/* /}/compare/${parentBranch/* /}...themsaid:${br/* /}\?expand\=1
}

Jeffrey Way

alias gl="git log --graph --pretty=format:'%Cred%h%Creset -%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)<%an>%Creset' --abbrev-commit"
alias wip="git add . && git commit -m 'wip'"
alias nah="git reset --hard && git clean -df"
alias p="phpunit"
alias pf="phpunit --filter "
alias art="php artisan"
alias migrate="php artisan migrate"

Bill Mitchell

alias a="php artisan"
alias pu="vendor/bin/phpunit"
alias puf="vendor/bin/phpunit --filter "
alias pug="vendor/bin/phpunit --group "
alias cdo="composer dump-autoload -o"
alias serve="php artisan serve"

Jesús Amieiro

alias pa='php artisan'
alias par:l='php artisan route:list'
alias pam='php artisan migrate'
alias pam:r='php artisan migrate:refresh'
alias pam:rs='php artisan migrate:refresh --seed'
alias cu='composer update'
alias ci='composer install'
alias cda='composer dump-autoload -o'
alias vu='cd ~/Homestead && vagrant up'
alias vs='vagrant suspend'
alias vssh='vagrant ssh'

Piotr

alias artisan="php artisan"
alias db-reset="php artisan migrate:reset && php artisan migrate --seed"

freekmurze

alias a="php artisan"

paulredmond

alias _='sudo'
alias art='php artisan'
alias tinker='php artisan tinker'
alias ll="ls -lh"
alias la='ls -lAh'
alias c='composer'
alias iphp='psysh' # repl
alias g='git'
alias gs='git status'
alias d='docker'
alias dc='docker-compose'
alias dm='docker-machine'
alias k='kubectl'
alias publicip='dig +short myip.opendns.com @resolver1.opendns.com'
alias chrome="/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome"

# Show file and folder permissions as octal
# Usage: `octal file.txt` or `octal my/path`
alias octal="stat -f '%A %a %N'"

# Mac conveniences for Linux
alias pbcopy='xclip -selection clipboard'
alias pbpaste='xclip -selection clipboard -o'
if type "xdg-open" &> /dev/null; then
    alias open="xdg-open"
fi

TJ Miller

alias nah='git reset --hard && git clean -fd'
alias aa='php artisan'

sebastiaanluca

# Hub (extend git commands)
alias git=hub

# Directories
alias ll='ls -FGlAhp'
alias ..="cd ../"
alias ...="cd ../../"
alias ....="cd ../../../"
alias .....="cd ../../../../"

alias df="df -h"
alias diskusage="df"
alias fu="du -ch"
alias folderusage="fu"
alias tfu="du -sh"
alias totalfolderusage="tfu"

alias finder='open -a 'Finder' .'

# Vagrant
alias vagrantgo="vagrant up && vagrant ssh"
alias vgo="vagrantgo"
alias vhalt="vagrant halt"
alias vreload="vagrant reload && vgo"

# PHP
alias c='composer'
alias cr='composer require'
alias cda='composer dumpautoload'
alias co='composer outdated --direct'
alias update-global-composer='cd ~/.composer && composer update'
alias composer-update-global='update-global-composer'

alias a='php artisan'
alias pa='php artisan'
alias phpa='php artisan'
alias art='php artisan'
alias arti='php artisan'

alias test='vendor/bin/phpunit'

alias y='yarn'
alias yr='yarn run'

# Homestead
alias edithomestead='open -a "Visual Studio Code" ~/Homestead/Homestead.yaml'
alias homesteadedit='edithomestead'
alias dev-homestead='cd ~/Homestead && vgo'
alias homestead-update='cd ~/Homestead && vagrant box update && git pull origin master'
alias update-homestead='homestead-update'

# Various
alias editaliases='open -a "Visual Studio Code" ~/.bash_aliases'
alias showpublickey='cat ~/.ssh/id_ed25519.pub'
alias ip="curl icanhazip.com"
alias localip="ifconfig | grep -Eo 'inet (addr:)?([0-9]*\.){3}[0-9]*' | grep -Eo '([0-9]*\.){3}[0-9]*' | grep -v '127.0.0.1'"
alias copy='rsync -avv --stats --human-readable --itemize-changes --progress --partial'

# Functions
mkcdir ()
{
    mkdir -p -- "$1" &&
    cd -P -- "$1"
}

function homestead() {
    ( cd ~/Homestead && vagrant "$@" )
}

Alexander Melihov

alias ars="php artisan serve"
alias art="php artisan tinker"

jordonbaade

alias l="php artisan"

Deleu

alias unit='php vendor/phpunit/phpunit/phpunit'

alias unitreport='php -d xdebug.profiler_enable=On vendor/phpunit/phpunit/phpunit --coverage-html=./public/report'

alias laravel-installer='composer create-project --prefer-dist laravel/laravel'

curieuxmurray

alias artisan="php artisan"
alias cclear='php artisan cache:clear'
# now with 5.5
alias fresh="artisan migrate:fresh --seed"

wilburpowery

alias pf="phpunit --filter"
alias artisan="php artisan"
alias tinker="php artisan tinker"

waunakeesoccer1

alias mfs="php artisan migrate:fresh --seed"

Maximize Your Terminal Productivity


This week we published some great bash aliases from the community, and I wanted to write more about maximizing your terminal productivity. I’ve coached many developers, and the terminal is often an area where even great developers (much better than me) could gain productivity and better tooling.

I started my development journey on a Windows XP machine, learning PHP 4, XHTML, CSS, and JavaScript. I didn’t even know what bash was, let alone used a terminal at the time. I think the extent of my experience consisted of running ipconfig to check my network settings.

That all changed when I watched the original Ruby on Rails demo video. Within that video, my mind was blown by the MVC pattern, Ruby, and Rails. I also learned about two of my favorite tools of all time: TextMate 1 (moment of silence please) and the Unix shell.

I literally bought a Mac just so I could use TextMate—that’s how good it was at the time. I am still sad that Sublime has surpassed that glorious editor; it was such a pleasure to use back in the day. I still think it has a future, but the community around it has shrunk considerably in my opinion. But I digress.

Along with TextMate, I started using the terminal to do things like creating files, copying files from a server, and finding things with ack and grep.

Fast-forward over ten years, and today I probably spend ~ 20-25% of my time in the terminal (it could be more or less). I’ve learned that the more I automate in my terminal workflow, the more productivity I unlock. Creating my dotfiles enabled me to have the same familiar terminal environment across all devices and servers.

No matter which machine I am using, my terminal setup is (nearly) identical on my Ubuntu ultrabook, on DigitalOcean, and on my MacBook Pro. I learned a ton just by extracting my dotfiles into a versioned project on GitHub. They’re my tools that I continually refine. I don’t recommend you use mine; you will gain so much experience by versioning your own!

I’m a Hack

While I’d say I know my way around the terminal as well as most, I still consider myself intermediate—I am not an expert. I use oh-my-zsh as the foundation with my tweaks on top.

I like to think of Oh My Zsh as a framework for the terminal. Just like a web framework like Laravel can give you a bunch for free out of the box with good conventions, Oh My Zsh provides good defaults and a bunch of stuff for free.

Install Zsh

I always feel a bit sad when I start collaborating or pairing with a developer and I notice a plain vanilla terminal with no customizations. I don’t shame them by any means, but I can’t help demonstrating a few things that can help. Ultimately, a stock shell without any customizations just looks sad to me.

Vanilla Bash Terminal

Sad indeed.

I’ve seen some amazing bash setups; some developers do great sticking with bash and customizing it by hand. More often than not, though, I’d spend a bunch of time setting up things like git autocompletion and a customized $PS1 shell variable for the prompt, and navigating the filesystem still feels clunky to me in a stock setup.

If you’re primarily a developer like me, I recommend you give ZSH and Oh My Zsh a try. I came for the sweet themes, but I stayed for the powerful features like plugins and history completion. Let’s see some of these things in action.

My Favorite Terminal Tips and Tricks

The following are some of my favorite tricks and tips that I show people when they want to learn more about improving productivity in the terminal.

As far as terminal applications, you can use the built-in terminal. On Ubuntu, I use the default. On OS X I prefer iTerm2, but Hyper is another cross-platform terminal you can use on Linux, OS X, and Windows.

The CD Path

You might be familiar with $CDPATH. The $CDPATH variable is similar to $PATH, but it holds a list of base directories that the cd command searches when you type a relative path, so you can jump to frequently used folders from anywhere on the filesystem. It can save you a ton of keystrokes. Unfortunately, bash doesn’t give you tab completion on these paths out of the box; Z shell does.

This is how you set your cdpath in .zshrc:

cdpath=(~/Code/valet ~/Code/github)

Once you set the cdpath and run source ~/.zshrc, you can tab-complete the folders within those paths from anywhere on the filesystem.
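To make the behavior concrete, here is a small POSIX shell sketch (the /tmp layout is invented for the demo) showing cd resolving a directory through $CDPATH instead of the current directory:

```shell
#!/bin/sh
# Invented project layout for the demo
mkdir -p /tmp/cdpath-demo/Code/github/my-app

# Directories cd should search when given a relative path
CDPATH=/tmp/cdpath-demo/Code/github
export CDPATH

cd /tmp      # start somewhere unrelated
cd my-app    # resolved via CDPATH; cd echoes the resolved path
pwd
```

When cd resolves a directory through $CDPATH it prints the full path it landed on, which is a handy confirmation that the lookup worked.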

Powerful Path Expansion

If you are still using bash, this feature alone might be enough to sell you on using ZSH. Here’s how path expansion works in ZSH: let’s say I am in a Laravel project and I want to dive into the laravel/framework vendor source from the root of the project:
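The original post demonstrated this with an animated capture; roughly, the interaction looks like this (an illustrative transcript — zsh expands every abbreviated segment at once when you press Tab):

```shell
$ cd v/l/f/s/I/D<Tab>
$ cd vendor/laravel/framework/src/Illuminate/Database
```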

And now for the same in bash:
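Stock bash only completes one segment per Tab press, so the same trip looks more like this (again, an illustrative transcript):

```shell
$ cd vendor/<Tab>
$ cd vendor/laravel/<Tab>
$ cd vendor/laravel/framework/<Tab>
# ...and so on, one directory at a time
```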

Okay, okay, I don’t intend to bash on Bash, but I hope you can see some of the supercharged tools available to you outside of the vanilla terminal experience.

History Completion

If you know the first part of a command you want to run, or if you want to cycle through variations you’ve run before, ZSH’s history completion is powerful. Oh My Zsh enables shared history, so all of your terminal sessions (windows and tabs) draw from the same history.

Let’s say I want to browse through some artisan commands I’ve run in the past:

To complete from history, start typing a command and then use the up and down arrows to cycle through the history entries that match.
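If your setup doesn’t already bind the arrow keys this way, a snippet along these lines in ~/.zshrc enables it (the escape sequences vary by terminal; check the output of bindkey on your machine):

```shell
# Cycle only through history entries that begin with what you've typed
bindkey '^[[A' history-beginning-search-backward   # Up arrow
bindkey '^[[B' history-beginning-search-forward    # Down arrow
```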

Oh My Zsh Plugins

If you use Oh My Zsh, you have access to some great plugins. Here’s my list from my .zshrc file:

plugins=(git cap composer phing rails rake ruby gem symfony2 bundler docker docker-compose laravel5)

Command Search

This tip isn’t exclusive to Bash or ZSH, but you need this one in your toolbox. Hit Ctrl+r, and you can search through commands quickly based on what you type. Continue to hit Ctrl+r to cycle through multiple results. It’s easiest to demonstrate:
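Since this was originally a screen capture, here’s an approximation of what you see (the matched command is just an example):

```shell
$ <press Ctrl+r, then start typing>
(reverse-i-search)`mig': php artisan migrate:fresh --seed
```

Press Enter to run the highlighted command, keep hitting Ctrl+r to cycle through older matches, or Ctrl+g to abort. (zsh’s incremental search prompt reads bck-i-search: but behaves the same way.)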

Searching History and Executing History

Searching through history is another general tip not related to using a bash shell or ZSH exclusively. The humble history command is a great resource when you can’t recall an awesome command you’ve run in the past. When Ctrl+r fails you, run something like the following:

$ history | grep 'docker rmi'
11214  docker rmi c6864c2e7026

Notice a number appears in the history next to each entry. If you use the exclamation point, you can re-run that entry:

$ !11214
$ docker rmi c6864c2e7026

SSH Config

Most of these tips revolve around terminal commands and navigation. This tip is more about supercharging your SSH usage. Organizing your SSH hosts in the ~/.ssh/config file can save you tons of keystrokes when you need to copy a file down from a server. Most of my production workloads run on Google’s Kubernetes machines with Docker, but now and then I need to SSH into my personal servers.

Here’s what an example SSH config entry might look like for me:

Host myalias
  Hostname 203.0.113.208
  IdentityFile ~/.ssh/id_rsa
  User myuser
  ForwardAgent yes

When I need to start an SSH session I can do the following:

$ ssh myalias

My agent is forwarded, and the right user and identity file are used automatically. It saves a ton of annoyance over making sure you are using the right credentials.

My favorite use of this config is to copy a file up or down:

$ scp myalias:~/myfile.txt ~/Downloads/myfile.txt

$ scp ~/Downloads/myfile.txt myalias:~/myfile.txt

If you’ve used scp before, you can appreciate how the alias makes this command easy to type and use correctly. I highly encourage you to read up and try using SSH configs.

Terminal Theme

I think using the same theme is another important part of making the terminal your home on any machine. If you are using iTerm, iterm2colorschemes.com is an excellent resource; I use Monokai Soda from that project. I’ve also modified my Ubuntu terminal to match the Monokai Soda colors, so I have the same experience everywhere, kept versioned in my dotfiles.

Having the same colors across platforms is huge for me and reduces some mental overhead adjusting between color schemes.

So Many More Terminal Tricks

There’s so much more I could show you, but these are probably some of my favorite terminal tools and tricks. I hope they inspire you to experiment with the terminal more and make it your own. Like any craftsman, get to know your tools and customize them until they feel like an irreplaceable hammer that fits perfectly in your hand.

I’d recommend checking out Oh My Zsh and browsing through the plugins; they have a plugin for nearly everything I use. Second, I’d recommend starting your own dotfiles repo on GitHub. You can look at mine as an example of how you might symlink your dotfiles into your $HOME folder.

The following command is how I clone my dotfiles repo on a machine:

git clone git@github.com:paulredmond/dotfiles.git ~/.dotfiles

And then I symlink them like so (after I’ve bootstrapped rbenv and installed rake):

$ rake install

You don’t have to use rake, but I love Ruby, and it was fun to bootstrap my dotfiles with my Rakefile. That’s the point of my dotfiles; it’s a playground for me to experiment with my console tools.

Make sure you don’t add in anything secret. In fact, I sometimes use a local file that I add to the .gitignore so I can have one-off things on a machine. Something like this would work in your ~/.zshrc file:

# Local config ignored by git
if [[ -e $HOME/.zshrc.local ]]
then
    source $HOME/.zshrc.local
fi

All of the Laracon US 2017 Videos are now available


Laracon US ended a few weeks ago and, like all the previous Laracons, it was a fantastic event. StreamACon recorded the talks for the second year in a row and has just made available the videos from each day of the event.

With twenty-one videos, you’ll be able to relive the whole event and catch up on anything you may have missed.

Lead Photo by Jakob Owens on Unsplash

Add Your Own Aliases to Laravel Homestead


Laravel Homestead ships with some aliases that get copied into your Homestead virtual machine when you provision it. These aliases are available inside the virtual machine, and you can add your own as well, so that you have the same aliases in Homestead that you use on your local setup. I’ll quickly show you how.

After you install Homestead and run bash init.sh, the init script copies the aliases file to ~/Homestead/aliases on OS X.

When you make changes to the aliases file, you need to reprovision the image:

vagrant reload --provision

If you look at the Vagrantfile, you can see how the aliases file gets copied into the Homestead machine. By default, Homestead has some helpful aliases already defined that you can use right away:

alias ..="cd .."
alias ...="cd ../.."

alias h='cd ~'
alias c='clear'
alias art=artisan

alias phpspec='vendor/bin/phpspec'
alias phpunit='vendor/bin/phpunit'
alias serve=serve-laravel

alias xoff='sudo phpdismod -s cli xdebug'
alias xon='sudo phpenmod -s cli xdebug'

For example, you can run “h” to go to the home folder, or run “phpunit” without referencing the full vendor/bin path. Another way I like to achieve something similar is adding vendor/bin to my path:

export PATH="./vendor/bin:$PATH"
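To see why this works: the shell searches $PATH left to right, so a project-local binary shadows a global one. Here is a quick sketch with a stub phpunit (the stub is invented purely for illustration):

```shell
#!/bin/sh
# Throwaway project with a stub vendor/bin/phpunit
dir=$(mktemp -d)
cd "$dir"
mkdir -p vendor/bin
printf '#!/bin/sh\necho "local phpunit"\n' > vendor/bin/phpunit
chmod +x vendor/bin/phpunit

# Prepend the project-local bin directory to PATH
PATH="./vendor/bin:$PATH"
export PATH

phpunit    # resolves to ./vendor/bin/phpunit, not any global install
```

The trade-off of a relative PATH entry is that it only resolves when your current directory is the project root, which is usually exactly where you run these tools anyway.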

Also, last week we shared some great Laravel community bash aliases that you might consider adding to Homestead!

Laravel 5.5 LTS is Now Released


Version 5.5 of Laravel is now officially released! This release is jam-packed with goodies and improvements–here’s a quick video summarizing the highlight features:

Taylor Otwell recently described his thoughts on Laravel 5.5 in this tweet:

Laravel 5.5 is the Next LTS Release

Laravel 5.5 is the next long term support (LTS) version of Laravel (the last being 5.1). LTS versions receive bug fixes for two years, and security fixes for three years.

General minor releases receive bug fixes for six months and security fixes for one year.

Whoops Package

You might recall the filp/whoops package from Laravel 4, which provided elegant stack traces for your most frustrating debugging moments. The whoops package returns in Laravel 5.5!

Collection Dumping

Another great debugging feature (that I’ve seen previously as a package or macro) is collection dumping methods:

<?php

Song::all()
    ->filter
    ->platinum
    ->dump()
    ->filter(function ($song) {
        return $song->released_on >= \Carbon\Carbon::parse('-10 years');
    })
    ->dd();

Read our collection dumping post for more details.

Exception Rendering

Exceptions can now render a response if they define a public render() method. Typically, in earlier versions of Laravel, you might add a check to the App\Exceptions\Handler::render() method and conditionally send back a response based on the exception type.

In 5.5, you can just throw the exception, and it can respond without additional logic in your handler:

<?php

// throw new TerribleSongException($song) in a controller...

namespace App\Exceptions;

use App\Song;

class TerribleSongException extends \Exception
{
    /**
     * @var \App\Song
     */
    protected $song;

    public function __construct(Song $song)
    {
        $this->song = $song;
    }

    /**
     * @param \Illuminate\Http\Request $request
     */
    public function render($request)
    {
        return response("The song '{$this->song->title}' by '{$this->song->artist}' is terrible.");
    }
}

You can also implement the Responsable interface in your exception classes, and Laravel will respond automatically.

The Responsable Interface

The Responsable interface is another response addition to Laravel that we’ve covered at Laravel News. A class implementing the interface can be returned from a controller method; the router now checks for an instance of Responsable when preparing the response from Illuminate\Routing\Router.

Here’s what an example might look like, leaving the response details to the NewSongResponse object:

public function store(Request $request)
{
    $data = request()->validate([
        'title' => 'required',
        'artist' => 'required',
        'description' => 'required',
        'duration' => 'required|numeric',
        'released_on' => 'required|date_format:Y-m-d',
        'gold' => 'boolean',
        'platinum' => 'boolean',
    ]);

    $song = new Song($data);
    $song->save();

    return new NewSongResponse($song);
}

Here’s what the class might look like implementing the Responsable interface for a new song creation:

<?php

namespace App\Http\Responses;

use App\Song;
use Illuminate\Contracts\Support\Responsable;

class NewSongResponse implements Responsable
{
    /**
     * @var \App\Song
     */
    protected $song;

    /**
     * @param \App\Song $song
     */
    public function __construct(Song $song)
    {
       $this->song = $song;
    }

    public function toResponse($request)
    {
        if ($request->wantsJson()) {
            return response()
                ->json($this->song)
                ->header('Location', route('songs.show', $this->song))
                ->setStatusCode(201);
        }

        return redirect()
            ->route('songs.show', $this->song);
    }
}

In this simple example, the response is JSON when the request is made via AJAX, and by default it redirects to the songs.show route.

Request Validation Method

In past versions of Laravel you would pass the request instance to the $this->validate() method in a controller:

$this->validate(request(), [...]);

Now, you can just call validate on the request object:

$data = request()->validate([
    'title' => 'required',
    'artist' => 'required',
    'description' => 'required',
    'duration' => 'required|numeric',
    'released_on' => 'required|date_format:Y-m-d',
    'gold' => 'boolean',
    'platinum' => 'boolean',
]);

Another nice benefit from this style of calling validation is that the return value acts like Request::only(), returning only the keys provided in the validation call. Returning only the validated keys is an excellent convention to use, avoiding Request::all().

Custom Validation Rule Objects and Closures

My favorite feature in Laravel 5.5 is hands-down the new custom validation rule objects and closures. Creating a custom rule object is an excellent alternative to creating custom rules with Validator::extend (which you can still use), because it’s more clear where the rule logic is located at a glance. A validation rule object might look like this:

<?php

namespace App\Rules;

use Illuminate\Contracts\Validation\Rule;

class CowbellValidationRule implements Rule
{
    public function passes($attribute, $value)
    {
        return $value > 10;
    }

    public function message()
    {
        return ':attribute needs more cowbell!';
    }
}

An example of using this validation rule looks like the following:

<?php

request()->validate([
    'cowbells' => [new CowbellValidationRule],
    'more_cowbells' => [function ($attribute, $value, $fail) {
        if ($value <= 10) {
            $fail(':attribute needs more cowbell!');
        }
    }]
]);

The closure style receives the attribute, the value, and a $fail callback that you invoke if the validation rule should fail. The closure is a nice way to experiment with custom validation before you extract it to a dedicated rule object, or for one-off custom validation needs.

To create custom validation rule objects, you can use the new make:rule command:

$ php artisan make:rule MyCustomRule

We have a dedicated post to custom validation rules here on Laravel News, be sure to check it out!

Auth and Guest Blade Directives

We have written about Blade::if() directives in 5.5. A few new conditional directives in 5.5 are @auth and @guest.

Typically you might use something like the following to check for an authenticated user in Blade:

@if(auth()->check())
    {{-- The user is authenticated --}}
@endif

@if(auth()->guest())
    {{-- The user is a guest --}}
@endif

You can now use the following directives to achieve the same thing:

@auth
    Welcome {{ auth()->user()->name }}!
@endauth

@guest
    Welcome Guest!
@endguest

Frontend Presets

When you are starting a new project, Laravel 5.5 provides Vue.js scaffolding by default. You can now pick from a few other presets, or remove all frontend scaffolding entirely, with the new “preset” Artisan command.

If you look at the help, you can see that it allows you to pick “none”, “bootstrap”, “vue”, or “react”:

$ php artisan help preset
Usage:
  preset <type>

Arguments:
  type    The preset type (none, bootstrap, vue, react)

# Use react
$ php artisan preset react

# Clear scaffolding
$ php artisan preset none

Separate Factory Files

Factory files were previously defined in one ModelFactory.php file. Now, you create different files for each model. You can create a factory file when you are creating a new model:

$ php artisan make:model -fm Post

# Or you can create a controller, migration, and factory
$ php artisan make:model --all Post

You can also create a factory file directly with “make:factory”:

$ php artisan make:factory --model=Example ExampleFactory

The migrate:fresh Migration Command

The new “migrate:fresh” command in 5.5 is a nice addition for creating a clean database in development. The migrate:fresh command drops all the database tables and then runs the migrations.

You might be familiar with the existing migrate:refresh command, which rolls back migrations and then reruns them. Often in development you just want to drop the tables, get a fresh database, and run the migrations.
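In practice, the two commands look like this:

```shell
# Drop all tables and run every migration from scratch
php artisan migrate:fresh

# Do the same, then run the database seeders
php artisan migrate:fresh --seed
```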

The RefreshDatabase Trait

On the testing front, the RefreshDatabase trait is the new way to migrate databases during tests. This new trait takes the most optimal approach to migrating your test database depending on if you are using an in-memory database or a traditional database. The DatabaseTransactions and DatabaseMigrations traits are still available in 5.5, allowing you to upgrade without using the new RefreshDatabase trait.

The withoutExceptionHandling() method

The base test case now provides a withoutExceptionHandling() method, which allows you to disable exception handling for a test. Disabling exception handling lets you catch an exception in your test and make assertions against it, instead of letting the exception handler respond. It’s also a useful debugging tool when your test is doing something you don’t expect and you want to see the actual exception.

Automatic Package Discovery

The last feature we are going to look at is automatic package discovery. While Laravel packages aren’t usually hard to install, the package detection feature means you don’t have to set up providers or aliases. You can disable auto-discovery for specific packages.

Learn more about this feature from Taylor Otwell’s article and from our post on package auto-discovery.

Learning More About Laravel 5.5

Make sure to check out our Laravel 5.5 content for a more in-depth look. Laracasts has a complete series available on all of these features. Also, check out the official documentation, release notes, and the upgrade guide.
