Golang Vs. Python: Which Programming Language Will Suit You?

When you kick off a new project, the development team typically goes through several meetings to decide on a suitable language. Selecting the best programming language is always a crucial decision for custom software development services.

Before moving forward with any project, the development team discusses its requirements. There is a plethora of options to choose from, such as PHP, JavaScript, Golang, Python, and C++, yet Go and Python often become the first choice among developers.

Top 6 Web Development Languages To Use

Today, web development is a promising career path. If you are a newcomer to the field, the question is where to start and which web development languages you should learn.

Inevitably, this depends largely on the type of project you get involved in, whether you are more comfortable working with back-end or front-end technologies, and whether you are good enough at mathematics and logic to pick up programming skills.

Traceroute Command in Python and How To Read a Traceroute

On operating systems like Windows or Linux, there is an invaluable tool called the traceroute command (on Windows, the equivalent command is called tracert). This command-line tool enables system administrators or network engineers to troubleshoot common networking issues.

Administrators use traceroute to probe for bottlenecks whenever a user complains that the connection to a website or server is slow. The traceroute command is also used when a server is unreachable, as it shows which part of the network route is problematic.
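
As a quick, hedged illustration (not code from the article), the tool can be driven from Python with the standard library's subprocess module; the hostname below is just a placeholder:

    import platform
    import subprocess

    def traceroute(host: str) -> str:
        # Windows ships "tracert"; Linux and macOS ship "traceroute".
        command = "tracert" if platform.system() == "Windows" else "traceroute"
        # Run the tool and capture its hop-by-hop output as text.
        result = subprocess.run([command, host], capture_output=True, text=True)
        return result.stdout

    print(traceroute("example.com"))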

How to Use Vaults and Wallets for Simple, Secure Connectivity

This is the third in a series of blogs on data-driven microservices design mechanisms and transaction patterns with the Oracle converged database. The first blog illustrated how to connect to an Oracle database in Java, JavaScript, Python, .NET, and Go as succinctly as possible. The second blog illustrated how to use that connection to receive and send messages with Oracle AQ (Advanced Queueing) queues and topics and conduct an update and read from the database using all of these same languages.  The goal of this third blog is to provide details on how to secure connections in these same languages as well as convenience integration features that are provided by microservice frameworks, specifically Helidon and Micronaut.

When making secure connections to Oracle databases there are two items to consider, the wallet and the password. We will discuss and provide examples of both in this blog.
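
To give a concrete flavor of what this looks like in Python (a minimal sketch using the python-oracledb driver; the DSN, wallet paths, and passwords are placeholders rather than values from the blog):

    import oracledb

    # Placeholder credentials, service name, and wallet location.
    connection = oracledb.connect(
        user="admin",
        password="your_db_password",
        dsn="mydb_high",
        config_dir="/path/to/wallet",       # directory containing tnsnames.ora
        wallet_location="/path/to/wallet",  # directory containing the wallet files
        wallet_password="your_wallet_password",
    )

    with connection.cursor() as cursor:
        cursor.execute("select sysdate from dual")
        print(cursor.fetchone())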

Developing Event-Driven Microservices

This is the second in a series of blogs on data-driven microservices design mechanisms and transaction patterns with the Oracle converged database. The first blog illustrated how to connect to an Oracle database in Java, JavaScript, Python, .NET, and Go as succinctly as possible. The goal of this second blog is to use that connection to receive and send messages with Oracle AQ (Advanced Queueing) queues and topics and conduct an update and read from the database using all of these same languages.

Advanced Queuing (AQ) is a messaging system that is part of every Oracle database edition and was first released in 2002. AQ sharded queues, which introduced partitioning in release 12c, are now called Transactional Event Queues (TEQ).
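
As a rough illustration of the Python side (my own sketch rather than code from the series; the connection details and queue name are placeholders), the Oracle driver exposes AQ through a queue object:

    import oracledb

    # Placeholder connection details and queue name.
    connection = oracledb.connect(user="demo", password="demo_password",
                                  dsn="localhost/orclpdb1")

    # Get a handle to an existing queue with a RAW payload.
    queue = connection.queue("DEMO_RAW_QUEUE")

    # Enqueue a message and commit so consumers can see it.
    queue.enqone(connection.msgproperties(payload=b"order created"))
    connection.commit()

    # Dequeue the next available message.
    message = queue.deqone()
    connection.commit()
    print(message.payload)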

Pandas: Pandas Tutor

This article focuses on “Pandas Tutor”, an excellent tool for learning and debugging Pandas programs. Pandas Tutor visualizes how Pandas code transforms data, which helps you learn Pandas quickly and also makes it very effective for debugging. It’s a great tool with superb visualizations, so let’s learn it by executing a few basic Pandas programs.

To access Pandas Tutor online, please use this link.  
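
For example, a short program like the following (my own illustrative snippet, not one from the article) is the kind of code you can paste into Pandas Tutor to watch each transformation step being visualized:

    import pandas as pd

    # A tiny DataFrame to experiment with.
    df = pd.DataFrame({
        "name": ["Alice", "Bob", "Carol", "Dave"],
        "dept": ["eng", "eng", "sales", "sales"],
        "salary": [100, 90, 80, 85],
    })

    # Pandas Tutor can show how each step reshapes the data:
    # filtering rows, then grouping and aggregating.
    result = (df[df["salary"] > 80]
              .groupby("dept")["salary"]
              .mean())
    print(result)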

A Beginner’s Guide to High-Performance Computing in Python

Ever since the Python programming language was born, its core philosophy has always been to maximize the readability and simplicity of code. In fact, this pursuit of readability and simplicity runs so deep in Python's roots that, if you type import this in a Python console, it recites a little poem:

    Beautiful is better than ugly. Explicit is better than implicit. Simple is better than complex. Complex is better than complicated. Flat is better than nested. Sparse is better than dense. Readability counts...

Simple is better than complex. Readability counts. No doubt, Python has indeed been quite successful at achieving these goals: it is by far the friendliest language to learn, and an average Python program is often 5 to 10 times shorter than equivalent C++ code. Unfortunately, there is a catch: Python's simplicity comes at the cost of reduced performance. In fact, it is not at all surprising for a Python program to be 10 to 100 times slower than its C++ counterpart. It thus appears that there is a perpetual trade-off between speed and simplicity, and that no programming language can ever possess both.

But, don't you worry, all hope is not lost.

Taichi: Best of Both Worlds

The Taichi Programming Language is an attempt to extend the Python programming language with constructs that enable general-purpose, high-performance computing. It is seamlessly embedded in Python, yet can summon every ounce of computing power in a machine -- the multi-core CPU, and more importantly, the GPU.

We'll show an example program written using Taichi. The program uses the GPU to run a real-time physical simulation of a piece of cloth falling onto a sphere and simultaneously renders the result.

Writing a real-time GPU physics simulator is rarely an easy task, but the Taichi source code behind this program is surprisingly simple. The remainder of this article will walk you through the entire implementation, so you can get a taste of the functionalities that Taichi provides, and just how powerful and friendly they are.

Before we begin, take a guess of how many lines of code this program consists of. You will find the answer at the end of the article.

Algorithmic Overview

Our program will model the piece of cloth as a mass-spring system. More specifically, we will represent the piece of cloth as an N by N grid of point-masses, where adjacent points are linked by springs. The following image, provided by Matthew Fisher, illustrates this structure:

The motion of this mass-spring system is affected by 4 factors:
  • Gravity
  • Internal forces of the springs
  • Damping
  • Collision with the red ball in the middle
For simplicity, we ignore the cloth's self-collisions in this blog. Our program begins at time t = 0. Then, at each step of the simulation, it advances time by a small constant dt. The program estimates what happens to the system in this small period of time by evaluating the effect of each of the 4 factors above, and it updates the position and velocity of each mass point at the end of the timestep. The updated positions of the mass points are then used to update the image rendered on the screen.

Getting Started

Although Taichi is a programming language in its own right, it exists in the form of a Python package and can be installed by simply running pip install taichi.

To start using Taichi in a Python program, import it under the alias ti:

import taichi as ti

The performance of a Taichi program is maximized if your machine has a CUDA-enabled Nvidia GPU. If this is the case, add the following line of code after the import: ti.init(arch=ti.cuda)

If you don't have a CUDA GPU, Taichi can still interact with your GPU via other graphics APIs, such as ti.metal, ti.vulkan, and ti.opengl. However, Taichi's support for these APIs is not as complete as its CUDA support, so, for now, use the CPU backend: ti.init(arch=ti.cpu). And don't worry, Taichi is blazing fast even if it only runs on the CPU.

Having initialized Taichi, we can start declaring the data structures used to describe the mass-spring cloth. We add the following lines of code:

    N = 128
    x = ti.Vector.field(3, float, (N, N)) 
    v = ti.Vector.field(3, float, (N, N))
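
Here, x and v will hold the positions and velocities of the N by N mass points described in the algorithmic overview. To give a flavor of how these fields are used, the sketch below is my own illustration (not the article's exact code) of initializing the grid and advancing it by one substep with Taichi kernels; the cell size, timestep, and ball parameters are placeholder assumptions, and the spring forces between neighboring points are omitted for brevity:

    quad_size = 1.0 / N                       # spacing between neighboring points (assumed)
    dt = 4e-3 / 64                            # substep length (placeholder)
    gravity = ti.Vector([0.0, -9.8, 0.0])
    ball_center = ti.Vector([0.0, 0.0, 0.0])  # the red ball the cloth falls onto
    ball_radius = 0.3

    @ti.kernel
    def init_cloth():
        # Lay the cloth out flat above the ball, at rest.
        for i, j in x:
            x[i, j] = ti.Vector([i * quad_size - 0.5, 0.6, j * quad_size - 0.5])
            v[i, j] = ti.Vector([0.0, 0.0, 0.0])

    @ti.kernel
    def substep():
        for i, j in x:
            # Gravity (spring and damping forces would be accumulated here too).
            v[i, j] += gravity * dt
            # Collision with the ball: cancel any velocity component
            # pointing into the sphere.
            offset = x[i, j] - ball_center
            if offset.norm() <= ball_radius:
                normal = offset.normalized()
                v[i, j] -= min(v[i, j].dot(normal), 0.0) * normal
            # Symplectic Euler position update.
            x[i, j] += v[i, j] * dt

In this sketch, init_cloth() would be called once before the simulation loop, and substep() many times per rendered frame.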


Building an ETL Pipeline With Airflow and ECS

Each day, enterprise-level companies collect, store and process different types of data from multiple sources. Whether it’s a payroll system, sales records, or inventory system, this torrent of data has to be attended to. 

And if you process data from multiple sources and want to consolidate it into a centralized database, you need to:

The 10 Commandments for Performing a Data Science Project

In designing a data science project, establishing what we, or the users we are building models for, want to achieve is vital, but this understanding only provides a blueprint for success. To truly deliver against a well-established brief, data science teams must follow best practices in executing the project. To help establish what that might mean, I have come up with ten points to provide a framework that can be applied to any data science project.

1. Understand the Problem 

The most fundamental part of solving any problem is knowing exactly what problem you are solving. Make sure that you understand what you are trying to predict, any constraints, and what the ultimate purpose of the project will be. Ask questions early on and validate your understanding with peers, domain experts, and end-users. If you find that the answers align with your understanding, you know that you are on the right path.

Loading Thousands of Tables in Parallel With Ray Into CockroachDB Because Why Not?

I came across an interesting scenario while working with one of our customers. They are using a common data integration tool to load hundreds of tables into CockroachDB simultaneously. They reported an issue where their loads fail intermittently with an unrecognized error. As a debugging exercise, I set out to write a script that imports data from an HTTP endpoint into CRDB in parallel.

Disclosure: I do not claim to be an expert in CRDB, Python, or anything else for that matter. This is more an exercise in answering a "why not?" question than anything educational. I wrote a Python script to execute an import job, and I need to make sure it runs in parallel to achieve the concurrency scenario I originally set out to test.
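
As a rough sketch of the idea (my own illustration, not the script from the article), Ray remote tasks can fan out one IMPORT job per table; the connection string, table names, and CSV URLs below are placeholders, and the exact IMPORT syntax should be checked against the CockroachDB docs for your version:

    import ray
    import psycopg2

    ray.init()

    # Placeholder connection string for a local CockroachDB node.
    DSN = "postgresql://root@localhost:26257/defaultdb?sslmode=disable"

    @ray.remote
    def import_table(table: str, csv_url: str) -> str:
        # Each task opens its own connection and runs one IMPORT job.
        conn = psycopg2.connect(DSN)
        conn.autocommit = True  # IMPORT cannot run inside an explicit transaction
        try:
            with conn.cursor() as cur:
                cur.execute(f"IMPORT INTO {table} CSV DATA ('{csv_url}')")
        finally:
            conn.close()
        return table

    # Hypothetical tables and source files.
    tables = {f"table_{i}": f"http://example.com/data_{i}.csv" for i in range(100)}
    done = ray.get([import_table.remote(t, url) for t, url in tables.items()])
    print(f"imported {len(done)} tables")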

Interview With FastAPI Creator – Sebastian Ramirez

Here is a quick recap of the knowledge shared by Sebastián Ramírez, an open-source enthusiast and the creator of FastAPI, Typer, and SQLModel. He has been building products and custom solutions for data and machine learning systems and has led teams of developers around the world. 

Hope you enjoy the interview! Let's go.

The Interview

Question: You have an impressive array of interests: frontend development, backend development, DevOps. What do you think about the "full-stack developer" concept, chased after by most tech companies? Is it a reasonable goal for most developers to pursue, or does it have any downsides?

Answer: Thanks! Yes, I’ve had a lot of interests while solving different problems, and I ended up learning several things from different sub-fields (e.g. backend, frontend, machine learning).

Mutual TLS With gRPC Between Python and Go Services

This tutorial walks you through the process of connecting services written in Python and Go via the gRPC framework using mutual TLS authentication. I assume that the reader is somewhat familiar with Python/Django and Go development and so omit most of the boring stuff like bootstrapping virtualenv with the Django app or how to “manage.py runserver” it.
The final code can be found here.
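
To sketch what the Python side of mutual TLS looks like (a generic example under my own assumptions, not the tutorial's final code), a gRPC channel is built from the CA certificate plus the client's own key and certificate; the file paths and address below are placeholders:

    import grpc

    # Placeholder certificate paths -- in practice these are issued by your CA.
    with open("certs/ca.crt", "rb") as f:
        ca_cert = f.read()
    with open("certs/client.key", "rb") as f:
        client_key = f.read()
    with open("certs/client.crt", "rb") as f:
        client_cert = f.read()

    # For mutual TLS the client presents its own certificate in addition
    # to verifying the server against the CA.
    credentials = grpc.ssl_channel_credentials(
        root_certificates=ca_cert,
        private_key=client_key,
        certificate_chain=client_cert,
    )

    channel = grpc.secure_channel("localhost:50051", credentials)
    # A stub generated from your .proto would then use this channel, e.g.:
    # stub = my_service_pb2_grpc.MyServiceStub(channel)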

Introduction

I have an old system in Python undergoing a significant overhaul. It’s a two-component system:

How Bokeh Secures Its Open-Source Repositories

Open-source is everywhere; it is one of the driving forces of software innovation from the academic to the enterprise world (75 percent of codebases audited by Synopsys in the 2021 OSSRA report rely on open-source components). Its prevalence in commercial software is reaching unprecedented levels, to the extent that the European Commission has identified it as a public good in a recent study assessing its impact on the region’s economy.

But the interstitial nature of open-source in modern software also makes it a subject of security and compliance concerns, as it is capable of exposing organizations that use it to a host of unknown risks and vulnerabilities. Most discussions we are hearing today around security in this space are focused on the identification, fixing, and remediation of vulnerabilities — all seen from the “consumer” perspective.

Node.js Vs. Python: Pros, Cons, and Use Cases

When choosing a programming language for backend development, your choice determines how the product will operate, scale, and fulfill user demands.

One of the most common is the dilemma of Node.js vs. Python. The two options are hugely popular and have their pros and cons. We work with both and are here to compare their advantages and disadvantages and help you to decide which one is better for your project.

Inserting Dynamic Data Into Jekyll Static Sites Using Python or Bash

Jekyll, the static site generator, uses the _config.yml file for configuration. The built-in settings are Jekyll-specific, but we can also define variables with our own content in these files and use them throughout our website. In this article, I’ll highlight some advantages of dynamically creating Jekyll config files.
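
As a simple sketch of the idea (my own example with hypothetical variable names), a Python script can write dynamic values into a separate YAML file that Jekyll then merges in via its --config option:

    from datetime import date
    import yaml  # PyYAML

    # Hypothetical dynamic values injected at build time.
    dynamic_config = {
        "build_date": date.today().isoformat(),
        "latest_release": "v1.2.3",
    }

    # Write them to a separate config file that Jekyll merges with _config.yml.
    with open("_config_dynamic.yml", "w") as f:
        yaml.safe_dump(dynamic_config, f)

    # Build or serve with both config files, for example:
    #   jekyll serve --config _config.yml,_config_dynamic.yml
    # The values are then available in templates as {{ site.build_date }}.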

On my local laptop, I use the following command to serve my Jekyll website for testing:

Container Creates Instant Database API

In this tutorial, we’ll show how to use ApiLogicServer to create, customize, and run a database-based API. API Logic Server is an open-source Docker container. With the commands shown below, you get:

  • Working Software, Now:
    • A database API server, to unblock UI development.
    • A multi-page web app, to engage Business Users — early in the project.
    • Declarative logic using unique spreadsheet-like rules — 40X more concise than code, extensible with Python — for remarkable business agility.
  • Customizable projects, using a standard language and tools. Operate in a cleanly isolated, containerized environment that matches your deployment architecture.

TL;DR — Create Database API and Basic Web App

Create the sample project in a minute or two, as follows. With Docker started, enter these Terminal commands (Windows, use Powershell):

Text Preprocessing Methods for Deep Learning

Deep learning, particularly for Natural Language Processing (NLP), has been gathering huge interest lately. Some time ago, there was an NLP competition on Kaggle called the Quora Question Insincerity Challenge. The competition is a text classification problem, and it becomes easier to understand after working through it, as well as by going through the invaluable kernels put up by the Kaggle experts.

First, let’s start by explaining a little more about the text classification problem in the competition. 

Externalizing Your Configurations With Vault for Scalable Deployments

Table of Contents:

  • Introduction
    1. The Solution
  • Setting Up Vault
    1. Creating API Admin Policy
    2. Creating Read-Only User Policy
    3. Creating a Token Attached to the API Read-Only Policy
  • 1. Linux Shell Integration
  • 2. Java Integration
  • 3. Python Integration
  • 4. Nodejs Integration
  • 5. Ansible Integration
  • Conclusion

Introduction:

To implement automation for microservices or applications deployed to a large number of systems, it becomes essential to externalize the configurations to a secure, scalable, centralized configuration store. This is necessary to deploy the application in multiple environments with environment-specific parameters, without human intervention and without modifying the core application during automated deployments, scaling, and failover recoveries.

Besides the fact that manually managed configurations involve the risk of human error, they are also not scalable for 24x7 large-scale deployments, particularly when we are deploying several instances of microservices across various infrastructure platforms.
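
For the Python integration, the idea can be sketched with the hvac client (a minimal example under my own assumptions; the Vault address, token, secret path, and keys are placeholders):

    import hvac

    # Placeholder address and token -- in practice the token would come from the
    # read-only policy described above and be injected via the environment.
    client = hvac.Client(url="https://vault.example.com:8200", token="s.xxxxxxxx")

    # Read application configuration from the KV v2 secrets engine.
    secret = client.secrets.kv.v2.read_secret_version(path="myapp/config")
    config = secret["data"]["data"]

    db_url = config["database_url"]
    print("Loaded configuration for:", db_url)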