The Wix Code Database and Data Modeling

This article was created in partnership with Wix. Thank you for supporting the partners who make SitePoint possible.

One of the cool features of Wix Code is the ability to separate your site’s design and layout from its content. This means you can create and maintain your information in a database and then have your pages dynamically retrieve and display this information in whatever way you like.

Let’s take an in-depth look at what you can do with the Wix Code database, including the types of information you can store, ways you can manipulate data with code, and how you can dynamically display the information on your site.

Throughout this article, we’ll use a simplified example of an art school that stores and displays information about its courses and teachers.

The Wix Code Database

Like all databases, the Wix Code database is made up of individual tables, which we call collections. In our example of the art school (see image below), we have two collections, one each for the courses and teachers.

(Image: art school collections)

You can create as many collections as you need and populate them with virtually unlimited data. A robust permissions model means you have complete control over who can access your information and what they can do with it.

You can work directly in your Live data, which is the information your visitors see when they view your pages. You can also work with Sandbox data, so you can try stuff out without affecting your live site. You can sync between them at any time.

Populating Collections

You have several options for populating your collections. You can manually enter data directly in the Wix Content Manager, either to your Live data or your Sandbox data.

If you’re an Excel ace, you can do all the work in Excel (or whatever spreadsheet program you prefer), save your sheet as a CSV file, and then import it into the Wix Code database. In fact, you can create your entire collection this way, schema and all. You can import to your Live data or your Sandbox data.

You can also export your Wix data to CSV files. If you make sure to include the built-in ID system field, you will be able to modify your content in your spreadsheet and then re-import it into your Wix Code database so that each record, or what we call item, is updated.

A third option is to build a form to capture user input and store it in your database.

Using External Databases

If you already have a database somewhere, you might be thinking that you don’t want to recreate it in Wix. The good news is that you don’t have to. As long as your database exposes an API, you can access it from your Wix site.

For simple applications, you can use the wix-fetch module—an implementation of the standard JavaScript Fetch API—to access your external database with an HTTP request and use that data in your Wix site’s pages.
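
For example, a backend web module along the following lines could pull records from an external service; the file name, endpoint URL, and response shape here are hypothetical:

// backend/externalCourses.jsw (hypothetical backend web module)
import { fetch } from 'wix-fetch';

export function getExternalCourses() {
  // call a hypothetical external API and return the parsed JSON to page code
  return fetch('https://example.com/api/courses', { method: 'get' })
    .then(httpResponse => {
      if (httpResponse.ok) {
        return httpResponse.json();
      }
      return Promise.reject('Could not fetch courses');
    });
}

Page code can then import getExternalCourses and bind the returned data to page elements.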

You can also pair the wix-fetch module with another Wix module, wix-router, that lets you control the routing of incoming requests. Using the functionality provided by both of these modules, you can create SEO-friendly dynamic pages that show different data depending on the URLs used to reach them.

For example, you can design a single member profile page that can be used by all of your site’s members.

Using wix-router and wix-fetch, you can write code that pulls information from incoming requests for the profile page, queries an external database to retrieve the information for the page, and then injects that data into the profile page. You can even add security to your page by using the wix-users module.

So if you create another page for users to update their profile pages, you can check who is trying to access it and only allow users to update their own profiles.
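
A rough sketch of that check, using wix-users together with wix-data in the page code, might look like the following; the Profiles collection, its ownerId field, and the #editForm element are made up for illustration:

import wixUsers from 'wix-users';
import wixData from 'wix-data';

$w.onReady(function () {
  const currentUserId = wixUsers.currentUser.id;

  // only load the profile that belongs to the logged-in visitor
  wixData.query('Profiles')
    .eq('ownerId', currentUserId)
    .find()
    .then(results => {
      if (results.items.length > 0) {
        // show the edit form only for the visitor's own profile
        $w('#editForm').show();
      }
    });
});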

Data Hooks

You can add hooks to actions on your collections using the wix-data API.

For example, in our Teachers collection, we have two separate fields: First name and Last name. To make displaying names on our pages easier, we also want to have one field that has both names together. To do this, we can add a beforeInsert hook to our Teachers collection that hooks into the insert action, reads the contents of the First name and Last name fields, and then concatenates them and populates the Full name field.
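
Data hooks live in the backend data.js file and are named after the collection and the action they hook into. A minimal sketch of the hook described above could look like this, assuming the field keys are firstName, lastName, and fullName:

// backend/data.js
export function Teachers_beforeInsert(item, context) {
  // combine the two name fields into the Full name field before the item is saved
  item.fullName = `${item.firstName} ${item.lastName}`;
  return item;
}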

Modeling Your Data

Now that we’ve covered the database itself, let’s talk about modeling your data in the Wix Code database.

Collection Schemas

Like all databases, each collection has a schema to define its fields. All standard field types are supported, including text, image, boolean, number, date and time, and rich text.

There is also a field type specifically designed for URLs. It automatically formats the URL into a clickable link that you can add to your pages. For example, teachers in your school could supply the URL of their portfolio website, and you could include that link on their dynamic page.

You can also use the document field type to store a wide range of file types. You can allow your users to download files stored in your collections (such as reading lists for each course) or to upload their own files.

ID Fields and Primary Fields

Each collection has an _id field, which is the primary key for that table. Collections also have a primary field (indicated by a lock icon), which is the display key for each item.

When you create joins using reference fields (see the next section), the values come from the primary field. The reference itself uses the _id field, of course. If you plan on using reference fields, it’s a good idea to make sure the data you store in the primary field is unique.

(Image: ID field)

Reference Fields

Reference fields create a connection between collections that is defined in the collection schema itself. This is similar to foreign keys in relational databases.

Each reference field points to a specific collection. The value that is displayed in the reference field in each item in the collection is taken from the value of the primary field of the referenced collection.

(Image: reference fields)

In our example, we created a reference field in our Courses collection that points to our Teachers collection so that we can indicate who teaches each class.

The advantage of reference fields is three-fold. First, they help maintain data integrity because their value is taken directly from the referenced collection. Second, they help eliminate data duplication, which we all know is the enemy of good database design. And third, when we create our page layouts, reference fields let us access information in the referenced collection as well as in the main collection we are using. This allows us to create master-detail pages, such as a list of all the courses taught by each teacher.

Creating Pages from Your Content

Of course, storing and maintaining data is nice, but the real point of having a website is displaying content to visitors. So let’s talk about how that works with Wix Code.

Back to our art school example. We have two different types of information: courses and teachers. So you could start by designing a page layout to display all the information about each of the courses. Then you might want to create a master-detail page that lists all of your teachers and the courses they teach.

Source: Sitepoint

How to Enable AI with Secure Communications

This article was sponsored by BlackBerry. Thank you for supporting the partners who make SitePoint possible.

Imagine a healthcare platform that designs the perfect treatment plan for a patient based on their medical history. Picture a chatbot that automatically generates legal appeals or resolves customer disputes in minutes. Imagine a virtual assistant that knows your habits, likes, and dislikes, and can suggest activities based on this knowledge.

This technology already exists today, and it is just the beginning. Alongside the Enterprise of Things, we are on the verge of a second revolution. Artificial intelligence will change everything, from how we protect ourselves from cyberattacks to how we go about our daily lives.

And much like the Enterprise of Things, businesses that do not start planning for artificial intelligence now will be left behind – according to Tata Consultancy Services, 84% of businesses believe AI will be essential.

Building A Smarter Enterprise – Artificial Intelligence and App Development

Application development will be the foundation of the move towards artificial intelligence. Businesses that integrate AI into their apps will be able to provide new services and a better, more personalized user experience. They will be able to gain deeper insights into how their customers think and act, and open new revenue streams through those insights.

Moreover, artificial intelligence will power new, transformative interactions between people, machines, and the Internet of Things.

Through AI-driven analysis, businesses will gain a deeper understanding of their market and their staff. Automation will allow workers to respond proactively to customer complaints or security incidents, boost productivity, reduce costs, and minimize employee error. And through machine learning, businesses will be able to make better, more informed decisions than ever before.

In time, people will demand such capabilities. Next-generation apps and services will be expected to not only support human-to-human interactions, but also human-to-machine and machine-to-machine interactions. Just as mobile apps are critical to business success today, artificial intelligence will be critical to success very soon.

Getting Past The Roadblocks to Enterprise AI

Though most businesses acknowledge artificial intelligence’s importance, AI remains elusive. The issue is primarily one of complexity and cost. In a 2017 study by Infosys, 72% of IT professionals found time constraints were a roadblock to AI adoption, while 71% referenced financial limitations. Resistance to change and a lack of clarity around AI’s value were also hindrances.

Even for businesses that manage to overcome those challenges, security remains a core issue. AI apps will routinely deal with sensitive data such as customer information, internal messages, login credentials, usage details, and even intellectual property. Left unencrypted, such services could leak that data into the wrong hands.

Communications Platform as a Service (CPaaS) tools are central to overcoming these challenges. By integrating real-time communications into their apps – and tying that functionality to its AI services – developers allow for better, deeper interactions between AI and user. More importantly, with the right CPaaS solution, they ensure those interactions are kept secure, and that the AI does not leak critical data.

How The BBM Enterprise SDK Makes Your Apps Smarter

Here’s where the BBM Enterprise SDK comes in. A secure CPaaS platform built on BlackBerry’s strength in secure mobile messaging, it gives your developers everything they need to incorporate secure, enterprise-grade messaging functionality in their apps. It also works with commonly used identity and user management providers to make development even easier.

More importantly, it offers several features that directly empower artificial intelligence:

  • Embedded voice, video, and chat. Enable your users to reach out to anyone around the world and be reached however they want, whether for emergency communications, peer-to-peer collaboration, or personalized support services.
  • Publish/Subscribe Services. Create channels which broadcast to subscribing users. This keeps them updated on all new activity in a collaboration space, whether by another user or from the machine-readable information your application consumes.
  • Chatbots and Routing Services. Provide real-time support for your users via a chatbot which can process their data, activity, and messages. This information is then used to route them to the correct contact.
  • AI-Driven Predictive Analytics. AI algorithms enable behind-the-scenes user empowerment, delivering relevant information to users when they need it. These include location-based alerts or suggested actions based on user behavior.
  • Secure IoT Data Sharing. Eliminate the worry of cached copies or “fingerprints in the cloud” that could compromise privacy while also supporting real-time data sharing across all endpoints – human and machine.

We suggest you first download the free SDK and familiarize yourself with it using BlackBerry’s Getting Started Guide.

Now that you’re ready, let’s dive into some examples that can help you get started with your AI journey…

How to Create Data Streams via Whiteboarding

This example shows how you can send arbitrary data in a BBM Enterprise SDK chat to create a shared whiteboard that allows us to do the following:

  • Create new whiteboards with one or more users
  • Share pictures and markup

Source: Sitepoint

Think Less, Embrace More: Inspiring Desktop Wallpapers For February 2018

Time flies by… The first month of the new year is already behind us, and with February just around the corner, it’s time for some fresh inspiration. So how about some wallpapers to tickle your ideas? Wallpapers that are a bit more distinctive than the ones you usually find out there? Well, we’ve got you covered.
As they have every month for more than nine years now, artists and designers from across the globe once again fired up their favorite illustration tools and got out their paint brushes and cameras to create charming desktop wallpapers that are sure to breathe some fresh life into your desktop.
Source: Smashing Magazine

Exception Handling in Laravel

In this article, we’re going to explore one of the most important and least discussed features of the Laravel web framework—exception handling. Laravel comes with a built-in exception handler that allows you to report and render exceptions easily and in a friendly manner.

In the first half of the article, we’ll explore the default settings provided by the exception handler. We’ll start by going through the default Handler class to understand how Laravel handles exceptions.

In the second half of the article, we’ll go ahead and see how you could create a custom exception handler that allows you to catch custom exceptions.

Setting Up the Prerequisites

Before we go ahead and dive into the Handler class straight away, let’s have a look at a couple of important configuration parameters related to exceptions.

Go ahead and open the config/app.php file. Let’s have a close look at the following snippet.
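
In a stock Laravel installation, the entry in question looks something like this:

'debug' => env('APP_DEBUG', false),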

As the name suggests, when it’s set to TRUE, it helps you debug errors generated by the application. The default value of this setting is taken from the APP_DEBUG environment variable in the .env file.

In the development environment, you should set it to TRUE so that you can easily trace errors and fix them. In the production environment, on the other hand, you’ll want to switch it off, in which case a generic error page is displayed instead.

In addition to displaying errors, Laravel allows you to log errors in the log file. Let’s have a quick look at the options available for logging. Again, let’s switch to the config/app.php file and have a close look at the following snippet.
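
In a Laravel 5.5-era config/app.php, the logging options look roughly like this (newer releases move logging configuration into config/logging.php):

'log' => env('APP_LOG', 'single'),

'log_level' => env('APP_LOG_LEVEL', 'debug'),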

As Laravel uses the Monolog PHP library for logging, you should set the above options in the context of that library.

The default log file is located at storage/logs/laravel.log, and it’s sufficient in most cases. On the other hand, the APP_LOG_LEVEL is set to a value that indicates the severity of errors that’ll be logged.

So that was a basic introduction to the configuration options available for exceptions and logging.

Next, let’s have a look at the default Handler class that comes with the default Laravel application. Go ahead and open the app/Exceptions/Handler.php file.

There are two important functions that the handler class is responsible for—reporting and rendering all errors.

Let’s have a close look at the report method.
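
In a fresh application, the report method simply delegates to the parent class, which writes the exception to the configured log. It looks more or less like this:

public function report(Exception $exception)
{
    parent::report($exception);
}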

The report method is used to log errors to the log file. At the same time, it’s also important to note the dontReport property, which lists all types of exceptions that shouldn’t be logged.

Next, let’s bring in the render method.
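
The default implementation again hands the work off to the framework, roughly as follows:

public function render($request, Exception $exception)
{
    return parent::render($request, $exception);
}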

If the report method is used to log or report errors, the render method is used to render errors on a screen. In fact, this method handles what will be displayed to users when the exception occurs.

The render method also allows you to customize a response for different types of exceptions, as we’ll see in the next section.

Finally, the unauthenticated method handles the AuthenticationException exception, letting you decide what is displayed to users when they try to access a page without being authenticated.
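
The stock implementation returns a JSON error for API requests and redirects browser requests to the login route, approximately like this:

protected function unauthenticated($request, AuthenticationException $exception)
{
    if ($request->expectsJson()) {
        return response()->json(['message' => 'Unauthenticated.'], 401);
    }

    return redirect()->guest(route('login'));
}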

Custom Exception Class

In this section, we’ll create a custom exception class that handles exceptions of the CustomException type. The idea behind creating custom exception classes is to easily manage custom exceptions and render custom responses at the same time.

Go ahead and create a file app/Exceptions/CustomException.php with the following contents.
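
A minimal sketch of that class might look like the following; the status code and the way the message is passed to the view are choices made for this example:

<?php

namespace App\Exceptions;

use Exception;

class CustomException extends Exception
{
    /**
     * Render the exception as an HTTP response.
     */
    public function render($request)
    {
        // display our custom error page along with the exception message
        return response()->view('errors.custom', ['message' => $this->getMessage()], 500);
    }
}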

The important thing to note here is that the CustomException class must extend the core Exception class. For demonstration purposes, we’ll only discuss the render method, but of course you could also customize the report method.

As you can see, we’re rendering the errors.custom error page in our case. In that way, you can implement custom error pages for specific types of exceptions.

Of course, we need to create an associated view file at resources/views/errors/custom.blade.php.
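
A bare-bones version of that view could be as simple as the following, assuming the $message variable passed in from the render method above:

<html>
    <body>
        <h1>Something went wrong!</h1>
        <p>{{ $message }}</p>
    </body>
</html>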

That’s a pretty simple view file that displays an error message, but of course you could design it the way you want it to be.

We also need to make changes in the render method of the app/Exceptions/Handler.php file so that our custom exception class can be invoked. Let’s replace the render method with the following contents in the app/Exceptions/Handler.php file.
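
One way to write that replacement, assuming the CustomException class created above, is:

public function render($request, Exception $exception)
{
    // let our custom exception render its own response
    if ($exception instanceof \App\Exceptions\CustomException) {
        return $exception->render($request);
    }

    return parent::render($request, $exception);
}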

As you can see, we first check the type of the exception in the render method. If the exception is an instance of App\Exceptions\CustomException, we call the render method of that class.

So everything is in place now. Next, let’s go ahead and create a controller file at app/Http/Controllers/ExceptionController.php so we can test our custom exception class.
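
For testing purposes, the controller only needs to throw the custom exception; something along these lines will do:

<?php

namespace App\Http\Controllers;

use App\Exceptions\CustomException;

class ExceptionController extends Controller
{
    public function index()
    {
        // deliberately throw the custom exception so we can see the handler in action
        throw new CustomException('Something went wrong!');
    }
}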

Of course, you need to add an associated route in the routes/web.php as shown in the following snippet.
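
The route just needs to map the URL used below to that controller action:

Route::get('exception/index', 'ExceptionController@index');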

And with that in place, you can visit the http://your-laravel-site.com/exception/index URL to see if it works as expected. It should display the errors.custom view as per our configuration.

So that’s how you are supposed to handle custom exceptions in Laravel. And that brings us to the end of this article—I hope you’ve enjoyed it!

Conclusion

Today, we went through the exception handling feature in Laravel. At the beginning of the article, we explored the basic configuration provided by Laravel in order to render and report exceptions. Further, we had a brief look at the default exception handler class.

In the second half of the article, we prepared a custom exception handler class that demonstrated how you could handle custom exceptions in your application.

For those of you who are either just getting started with Laravel or looking to expand your knowledge, site, or application with extensions, we have a variety of things you can study in Envato Market.

I would love to hear from you in the form of queries and suggestions!


Source: Nettuts Web Development

A Comprehensive Guide To Product Design

(This is a sponsored article.) What is a product? Until recently, the term was used only in relation to something material and often found in a retail store. Nowadays, it is coming to mean digital products as well. Apps and websites are modern products.
When it comes to building great products, design is the most important “feature.” We’ve moved into the stage where product design dominates — it’s what sets companies apart and gives a real edge over competitors.
Source: Smashing Magazine

A GraphQL Primer: The Evolution Of API Design (Part 2)

In Part 1 we looked at how APIs have evolved over the last few decades and how each one gave way to the next. We also talked about some of the particular drawbacks of using REST for mobile client development. In this article, I want to look at where mobile client API design appears to be headed — with a particular emphasis on GraphQL.
There are, of course, lots of people, companies, and projects that have tried to address REST’s shortcomings over the years: HAL, Swagger/OpenAPI, OData, JSON API and dozens of other smaller or internal projects have all sought to bring order to the spec-less world of REST.
Source: Smashing Magazine

An Introduction to Gulp.js

Developers spend precious little time coding. Even if we ignore irritating meetings, much of the job involves basic tasks which can sap your working day:

  • generating HTML from templates and content files
  • compressing new and modified images
  • compiling Sass to CSS code
  • removing console and debugger statements from scripts
  • transpiling ES6 to cross-browser–compatible ES5 code
  • code linting and validation
  • concatenating and minifying CSS and JavaScript files
  • deploying files to development, staging and production servers.

Tasks must be repeated every time you make a change. You may start with good intentions, but even the most diligent developer will forget to compress an image or two. Over time, pre-production tasks become increasingly arduous and time-consuming; you’ll dread the inevitable content and template changes. It’s mind-numbing, repetitive work. Wouldn’t it be better to spend your time on more profitable jobs?

If so, you need a task runner or build process.

That Sounds Scarily Complicated!

Creating a build process will take time. It’s more complex than performing each task manually, but over the long term, you’ll save hours of effort, reduce human error and save your sanity. Adopt a pragmatic approach:

  • Automate the most frustrating tasks first.
  • Try not to over-complicate your build process. An hour or two is more than enough for the initial setup.
  • Choose task runner software and stick with it for a while. Don’t switch to another option on a whim.

Some of the tools and concepts may be new to you, but take a deep breath and concentrate on one thing at a time.

Task Runners: the Options

Build tools such as GNU Make have been available for decades, but web-specific task runners are a relatively new phenomenon. The first to achieve critical mass was Grunt — a Node.js task runner which used plugins controlled (originally) by a JSON configuration file. Grunt was hugely successful, but there were a number of issues:

  1. Grunt required plugins for basic functionality such as file watching.
  2. Grunt plugins often performed multiple tasks, which made customisation more awkward.
  3. JSON configuration could become unwieldy for all but the most basic tasks.
  4. Tasks could run slowly because Grunt saved files between every processing step.

Many issues were addressed in later editions, but Gulp had already arrived and offered a number of improvements:

  1. Features such as file watching were built in.
  2. Gulp plugins were (mostly) designed to do a single job.
  3. Gulp used JavaScript configuration code that was less verbose, easier to read, simpler to modify, and provided better flexibility.
  4. Gulp was faster because it uses Node.js streams to pass data through a series of piped plugins. Files were only written at the end of the task.

Of course, Gulp itself isn’t perfect, and new task runners such as Broccoli.js, Brunch and webpack have also been competing for developer attention. More recently, npm itself has been touted as a simpler option. All have their pros and cons, but Gulp remains the favorite and is currently used by more than 40% of web developers.

Gulp requires Node.js, but while some JavaScript knowledge is beneficial, developers from all web programming faiths will find it useful.

What About Gulp 4?

This tutorial describes how to use Gulp 3 — the most recent release version at the time of writing. Gulp 4 has been in development for some time but remains a beta product. It’s possible to use or switch to Gulp 4, but I recommend sticking with version 3 until the final release.

Step 1: Install Node.js

Node.js can be downloaded for Windows, macOS and Linux from nodejs.org/download/. There are various options for installing from binaries, package managers and docker images, and full instructions are available.

Note for Windows users: Node.js and Gulp run on Windows, but some plugins may not install or run if they depend on native Linux binaries such as image compression libraries. One option for Windows 10 users is the new bash command-line, which solves many issues.

Once installed, open a command prompt and enter:

node -v

This reveals the version number. You’re about to make heavy use of npm — the Node.js package manager which is used to install modules. Examine its version number:

npm -v

Note for Linux users: Node.js modules can be installed globally so they’re available throughout your system. However, most users will not have permission to write to the global directories unless npm commands are prefixed with sudo. There are a number of options to fix npm permissions and tools such as nvm can help, but I often change the default directory. For example, on Ubuntu/Debian-based platforms:

cd ~
mkdir .node_modules_global
npm config set prefix=$HOME/.node_modules_global
npm install npm -g

Then add the following line to the end of ~/.bashrc:

export PATH="$HOME/.node_modules_global/bin:$PATH"

Finally, update with this:

source ~/.bashrc

Step 2: Install Gulp Globally

Install Gulp command-line interface globally so the gulp command can be run from any project folder:

npm install gulp-cli -g

Verify Gulp has installed with this:

gulp -v

Step 3: Configure Your Project

Note for Node.js projects: you can skip this step if you already have a package.json configuration file.

Presume you have a new or pre-existing project in the folder project1. Navigate to this folder and initialize it with npm:

cd project1
npm init

You’ll be asked a series of questions. Enter a value or hit Return to accept defaults. A package.json file will be created on completion which stores your npm configuration settings.

Note for Git users: Node.js installs modules to a node_modules folder. You should add this to your .gitignore file to ensure they’re not committed to your repository. When deploying the project to another PC, you can run npm install to restore them.

For the remainder of this article, we’ll presume your project folder contains the following sub-folders:

src folder: preprocessed source files

This contains further sub-folders:

  • html – HTML source files and templates
  • images — the original uncompressed images
  • js — multiple preprocessed script files
  • scss — multiple preprocessed Sass .scss files

build folder: compiled/processed files

Gulp will create files and sub-folders as necessary:

  • html — compiled static HTML files
  • images — compressed images
  • js — a single concatenated and minified JavaScript file
  • css — a single compiled and minified CSS file

Your project will almost certainly be different, but this structure is used for the examples below.

Tip: If you’re on a Unix-based system and you just want to follow along with the tutorial, you can recreate the folder structure with the following command:

mkdir -p src/{html,images,js,scss} build/{html,images,js,css}

Step 4: Install Gulp Locally

You can now install Gulp in your project folder using the command:

npm install gulp --save-dev

This installs Gulp as a development dependency and the "devDependencies" section of package.json is updated accordingly. We’ll presume Gulp and all plugins are development dependencies for the remainder of this tutorial.
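
After the install completes, the relevant part of package.json should look something like this (the exact version number depends on when you install):

"devDependencies": {
  "gulp": "^3.9.1"
}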

Alternative Deployment Options

Development dependencies are not installed when the NODE_ENV environment variable is set to production on your operating system. You would normally do this on your live server with the Mac/Linux command:

export NODE_ENV=production

Or on Windows:

set NODE_ENV=production

This tutorial presumes your assets will be compiled to the build folder and committed to your Git repository or uploaded directly to the server. However, it may be preferable to build assets on the live server if you want to change the way they are created. For example, HTML, CSS and JavaScript files are minified on production but not development environments. In that case, use the --save option for Gulp and all plugins, i.e.

npm install gulp --save

This sets Gulp as an application dependency in the "dependencies" section of package.json. It will be installed when you enter npm install and can be run wherever the project is deployed. You can remove the build folder from your repository since the files can be created on any platform when required.

Step 5: Create a Gulp Configuration File

Create a new gulpfile.js configuration file in the root of your project folder. Add some basic code to get started:

// Gulp.js configuration
var
  // modules
  gulp = require('gulp'),

  // development mode?
  devBuild = (process.env.NODE_ENV !== 'production'),

  // folders
  folder = {
    src: 'src/',
    build: 'build/'
  }
;

This references the Gulp module, sets a devBuild variable to true when running in development (or non-production mode) and defines the source and build folder locations.

ES6 note: ES5-compatible JavaScript code is provided in this tutorial. This will work for all versions of Gulp and Node.js with or without the --harmony flag. Most ES6 features are supported in Node 6 and above so feel free to use arrow functions, let, const, etc. if you’re using a recent version.

gulpfile.js won’t do anything yet because you need to …

Step 6: Create Gulp Tasks

On its own, Gulp does nothing. You must:

  1. install Gulp plugins, and
  2. write tasks which utilize those plugins to do something useful.

It’s possible to write your own plugins but, since almost 3,000 are available, it’s unlikely you’ll ever need to. You can search using Gulp’s own directory at gulpjs.com/plugins/, on npmjs.com, or search “gulp something” to harness the mighty power of Google.

Gulp provides three primary task methods:

  • gulp.task — defines a new task with a name, optional array of dependencies and a function.
  • gulp.src — sets the folder where source files are located.
  • gulp.dest — sets the destination folder where build files will be placed.

Any number of plugin calls are set with pipe between the .src and .dest.
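
As a taste of what a complete task looks like, here’s a hypothetical image-compression task built on the configuration above. It assumes the gulp-imagemin plugin has been installed with npm install gulp-imagemin --save-dev:

// image processing task
var imagemin = require('gulp-imagemin');

gulp.task('images', function() {
  var out = folder.build + 'images/';

  return gulp.src(folder.src + 'images/**/*')
    .pipe(imagemin())
    .pipe(gulp.dest(out));
});

Running gulp images from the project folder would then compress every image in src/images and write the result to build/images.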

Source: Sitepoint

Tongue Tied? Communicate Faster with Visuals

This article was sponsored by CloudApp. Thank you for supporting the partners who make SitePoint possible.

While you’re trying to figure out how to explain what you’re looking at, thousands of people are already capturing GIFs, videos, or images and sharing them with the world.

With CloudApp, you can replace bulleted lists and fragmented thoughts with easily understood visuals. Here are just a few ways teams use CloudApp to collaborate better:

  • A product manager takes a GIF of a bug, adds annotations, and attaches it to a support ticket for the dev team.
  • A founder records a video of a product demo, embeds it in an email, and shares it instantly with the board.
  • A sales lead embeds a presentation in the body of an email, letting them make a sale without any additional tabs or documents.
  • A software developer takes a GIF of a new feature and adds it to the changelog, clearly illustrating updates.

Don’t believe us? Here’s how it works.

Source: Sitepoint

Respect Always Comes First

The past years have been remarkable for web technologies. Our code has become modular, clean and well-defined. Our tooling for build processes and audits and testing and maintenance has never been so powerful. Our design process is systematic and efficient. Our interfaces are smooth and responsive, with a sprinkle of beautiful transitions and animations here and there. And after so many years, accessibility and performance have finally become established, well-recognized pillars of user experience.
Source: Smashing Magazine

Automated Browser Testing With The WebDriver API

Manually clicking through different browsers as they run your development code, either locally or remotely, is a quick way to validate that code. It allows you to visually inspect that things are as you intended them to be from a layout and functionality point of view. However, it’s not a solution for testing the full breadth of your site’s code base on the assortment of browsers and device types available to your customers.
Source: Smashing Magazine