Software Architecture Metrics Interview With Neal Ford

Neal Ford is Director, Software Architect, and Meme Wrangler at Thoughtworks, a software company and community of passionate, purpose-led individuals who think disruptively to deliver technology that addresses the most demanding challenges, all while seeking to revolutionize the IT industry and create positive social change. Neal is an internationally recognized expert on software development and delivery, especially at the intersection of Agile engineering techniques and software architecture. He has authored magazine articles, nine books (and counting), and dozens of video presentations, and has spoken at hundreds of developers’ conferences worldwide. His topics include software architecture, continuous delivery, functional programming, and cutting-edge software innovations, and his work also includes a business-focused book and video on improving technical presentations.

Apiumhub had the opportunity to interview Neal Ford, a Global Software Architecture Summit speaker, to learn which metrics he normally uses and to find out more about his chapter in the Software Architecture Metrics book recently published by O’Reilly.

Continuous Architecture Principles and Goals

Creating and maintaining a software architecture that remains sustainable over time is a challenge for software architects and developers. Software architects used to try to meet every requirement, provide every feature, and plan every system component at once with big up-front architecture, completing and perfecting architectural designs before implementation started. Alternatively, teams might produce emergent architectures, where development teams start delivering functionality and let architectural designs emerge with little up-front planning. Unfortunately, neither of those approaches is consistently successful in delivering sustainable architecture.

Continuous Architecture Principles

“Continuous Architecture” is about using the appropriate tools to make the right decisions and to support Continuous Delivery, Continuous Integration, and Continuous Testing. It is an approach based on a toolbox rather than a formal process.

How to Minimize Software Development Cost

One of the typical questions when looking at a project briefing is how to minimize software development costs. Of course, there are many ways of doing so without sacrificing quality; however, don’t forget that it is all about trade-offs.

Collaborate With Software Experts

When we talk about reducing software development costs, outsourcing is one way to achieve this. Fortunately, you can find many exceptional development partners, such as Apiumhub, and you no longer have to worry about geographical or knowledge limitations. In addition, partnering with a software agency allows you to leverage a large pool of resources and save on many expenses, such as paid vacations and sick leave, insurance, recruitment costs, onboarding and training budgets, and more. At first sight, hourly or monthly rates may seem expensive, but if you choose true experts, your project is built properly from the start, applying best practices and giving due attention to software architecture, infrastructure, and so on. Note that it is highly important to provide a detailed project scope and requirements before starting a collaboration.

Keystone Interface and Keystone Flag

As we all know, software developers can ease their development process by integrating their work as often as possible, and releasing frequently into production helps a lot as well. But developers and project stakeholders don’t want to expose half-developed features to their users. So, what happens in this case?

A useful technique to deal with this issue is to build the backend and integrate it, but hold back the user interface. The feature can be integrated and tested, yet the UI is kept hidden with the help of a keystone (you can find more details on Martin Fowler’s blog) and is added only when the feature is complete, at which point it becomes visible to users.
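
As a minimal sketch of the idea, assuming a small Flask application and a hypothetical “export report” feature: the backend endpoint is merged and testable from day one, while the keystone, the single line that exposes the feature in the UI, is only added in the final commit.

```python
# Minimal sketch of the keystone approach (all names are hypothetical).
# The backend of a new "export report" feature is built, merged, and
# testable from day one, but users cannot reach it because the UI entry
# point has not been wired up yet.
from flask import Flask, jsonify, render_template_string

app = Flask(__name__)


@app.route("/api/export-report")
def export_report():
    # Fully integrated and testable long before the feature is announced.
    return jsonify({"status": "ok", "rows": 42})


@app.route("/")
def home():
    links = ["<a href='/dashboard'>Dashboard</a>"]
    # Keystone: this single line is only added in the final commit,
    # once the whole feature is complete and ready to show to users.
    # links.append("<a href='/api/export-report'>Export report</a>")
    return render_template_string("<br>".join(links))


if __name__ == "__main__":
    app.run()
```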

Backlog Refinement or Backlog Grooming

In order to get everybody on the team aligned, teams plan the work that should be done in the next sprint. The purpose of sprint planning is to agree on a goal for the next sprint and the set of backlog items needed to achieve it. Sprint planning is about prioritizing backlog items and agreeing on the number of backlog items in the sprint based on team capacity, and it kicks off every sprint. Scrum suggests investing two hours per sprint week in planning sessions; experienced teams can cut this down to an hour per week or less, mostly because they are comfortable with less detail up front and more uncertainty in their definition of ready. The meeting is attended by the entire team, and outside stakeholders are invited if they can provide additional expertise for specific backlog items. Today we will discuss a very important topic: backlog refinement, also known as backlog grooming.

Backlog Refinement or Backlog Grooming

Backlog refinement, or backlog grooming, means keeping the backlog up to date and getting backlog items ready for delivery. This involves rewriting backlog items to be more expressive, deleting obsolete ones, re-assessing the relative priority of stories, splitting big items into smaller ones, re-sorting them, and correcting estimates in light of newly discovered information.

Trunk-Based Development

When coding an application, it is important to remain in sync with the other engineers working on the project. One strategy that helps a team stay in sync with codebase changes is trunk-based development. When employing trunk-based development, the developers working on a project make all their code changes in a common branch known as "trunk". There are numerous benefits to developing with this approach, which we will discuss in this article.

What Is Trunk-Based Development?

Trunk-based development is a version control management practice in which developers merge small, frequent updates into a core trunk or main branch. It is a common practice among DevOps teams and part of the DevOps lifecycle, as it streamlines the merging and integration phases; in fact, trunk-based development is a required practice for CI/CD. In contrast to long-lived feature branching strategies, developers create short-lived branches with only a few small commits. As codebase complexity and team size grow, trunk-based development helps keep production releases flowing.
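
Because unfinished work is merged to the main branch frequently, trunk-based development is usually paired with simple feature toggles so that incomplete code can live on trunk without being exposed. A minimal sketch follows; the toggle file name and feature key are hypothetical.

```python
# Minimal feature-toggle sketch often used alongside trunk-based development
# (the toggle file name and feature key are hypothetical).
import json
from pathlib import Path

TOGGLES_FILE = Path("feature_toggles.json")  # e.g. {"new_pricing_engine": false}


def is_enabled(feature: str) -> bool:
    """Return True if the named feature is switched on in the toggle file."""
    if not TOGGLES_FILE.exists():
        return False
    toggles = json.loads(TOGGLES_FILE.read_text())
    return bool(toggles.get(feature, False))


def calculate_price(amount: float) -> float:
    # Half-finished pricing logic can be merged to trunk behind the toggle
    # and switched on only when it is ready.
    if is_enabled("new_pricing_engine"):
        return amount * 0.9
    return amount
```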

Incident Management Process and Tools

Incident management is one of the most critical processes a software development team has to get right. Service outages can be costly to the business, and teams need an efficient way to respond to and resolve these issues quickly. For example, many organizations report downtime costing more than 300,000 euros per hour, according to Gartner, and for some web-based services that number can be dramatically higher. In this article, we will discuss how critical it is to have a reliable method to prioritize incidents, reach resolution faster, and offer a better service to end users.

What is Incident Management?

First of all, what is incident management exactly? It is the process used by DevOps and software development teams to respond to an unplanned event or service interruption and restore the service to its operational state.
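
A reliable way to prioritize incidents is often expressed as an impact/urgency matrix. The sketch below is illustrative rather than a standard; the levels, scores, and thresholds are assumptions.

```python
# Illustrative impact/urgency matrix for triaging incidents
# (the levels, scores, and thresholds are assumptions, not a standard).

IMPACT = {"low": 3, "medium": 2, "high": 1}    # how many users or services are affected
URGENCY = {"low": 3, "medium": 2, "high": 1}   # how quickly the damage grows


def incident_priority(impact: str, urgency: str) -> str:
    """Map impact and urgency to a priority label (P1 = most severe)."""
    score = IMPACT[impact] + URGENCY[urgency]
    if score <= 2:
        return "P1"  # e.g. full outage: page the on-call engineer immediately
    if score <= 3:
        return "P2"
    if score <= 4:
        return "P3"
    return "P4"


print(incident_priority("high", "high"))   # P1
print(incident_priority("medium", "low"))  # P4
```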

Machine Learning Interview With Gema Parreño, Lead Data Scientist at Apiumhub

Today we have interviewed Gema Parreño, Lead Data Scientist at the software development company Apiumhub, where she develops data-driven solutions. She is passionate about the intersection of machine learning and games; she has had her own startup, contributed to the open source StarCraft machine learning project, and gained experience at Google Brain working on Stadia.

Gema gives a talk about Mempathy as an AI Safety and Alignment opportunity, and we wanted to dig deeper and find out more about it, as well as how the idea of using it to implement Safety and Alignment techniques arose.

Choosing a Custom Software Development Company That Delivers: 11 Key Factors To Account For

Finding the right custom software development company that delivers is a complex challenge. It is like hiring a crew to build your new house: triple-check whom you will be working with over the next few months or years, as it has a direct impact on your success or failure. Since a quality custom software solution is at the heart of every successful organization, we decided to create a guide with the key factors to consider when choosing a custom software development company that delivers.

Key Factors to Take Into Account When Choosing a Custom Software Development Company That Delivers

1. Review Portfolios

Explore the prospective software development company’s previous projects. Picking a company with proven experience in a specific industry, technology, or type of project can be advantageous, as it will be familiar with the challenges that can occur during the custom software development life cycle for a specific product or feature. When reviewing portfolios, also consider the size of your project; you’ll notice whether a company prefers working with projects of a certain size.

How To Create Software Architecture Culture In Your Team

There are some qualities that differentiate average from high-performing software development companies, and attitude towards software architecture culture is one of them.

Software Architecture Culture

The culture and the results are much better in teams where software developers and architects pay particular attention to software architecture quality, while in teams where it’s all about delivering tickets as quickly as possible, the culture and the results are poorer. This has been proven time and again. However, it is very important to highlight that too much focus on software architecture and not enough on delivery is hugely counter-productive as well. Both valuing software architecture and striving for continuous delivery are critical.

Craft Conference Is Coming Back Online

At Apium Academy, we help developers get better through practical workshops, courses, and events. We believe in constant improvement and growth. We recently published an article about the Top 5 Online Events On New Technologies And Trends 2021, and we decided to write a post specifically about Craft Conference, pointing out the most interesting talks, workshops, and speakers, because we truly believe this conference is worth your time!

We regularly host and participate in software architecture events, and today we highlight the talks worth attending at the Virtual Craft Conference. We are featured as partners of this conference, our team will be there, and we really hope to see you there as well!

Data Mining: Use Cases, Benefits, and Tools

In the last decade, advances in processing power and speed have allowed us to move from tedious and time-consuming manual practices to fast and easy automated data analysis. The more complex the data sets collected, the greater the potential to uncover relevant information. Retailers, banks, manufacturers, healthcare companies, etc., are using data mining to uncover the relationships between everything from price optimization, promotions, and demographics to how economics, risk, competition, and online presence affect their business models, revenues, operations, and customer relationships. Today, data scientists have become indispensable to organizations around the world as companies seek to achieve bigger goals than ever before with data science. In this article, you will learn about the main use cases of data mining and how it has opened up a world of possibilities for businesses.

Today, organizations have access to more data than ever before. However, making sense of the huge volumes of structured and unstructured data to implement improvements across the organization can be extremely difficult due to the sheer volume of information.
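
As a small, hedged illustration of what “uncovering relationships” in data can look like in practice, the sketch below segments customers by spend and visit frequency using scikit-learn; the figures and feature meanings are invented for the example.

```python
# Illustrative data mining sketch: clustering customers into segments
# (the figures and feature meanings are invented for this example).
import numpy as np
from sklearn.cluster import KMeans

# Each row: [annual_spend_eur, visits_per_month]
customers = np.array([
    [120, 1], [150, 2], [130, 1],     # occasional buyers
    [900, 10], [950, 12], [880, 9],   # frequent, high-spend buyers
    [400, 5], [420, 4], [390, 6],     # mid-range buyers
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)
print(kmeans.labels_)           # segment assigned to each customer
print(kmeans.cluster_centers_)  # the "typical" customer of each segment
```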

4 Key Data Modeling Tools

Every day, quintillions of bytes of data are created, and the pace keeps accelerating. With so much information at our disposal, it is becoming increasingly important for organizations and enterprises to access and analyze the relevant data to predict outcomes and improve services. In order to access the data properly and extract the most out of it, it is essential to model your data correctly. Data modeling tools become critical here because they enable organizations to make data-driven decisions and meet varied business goals.

What is Data Modeling?

Data modeling is the process of visualizing and representing data for storage in a data warehouse. The modeling itself can include diagrams, symbols, or text to represent data and the way that it interrelates. Because of the structure that data modeling imposes upon data, the process subsequently increases consistency in naming, rules, semantics, and security, while also improving data analytics. The goal is to illustrate the types of data used and stored within the system, the relationships among these data types, the ways the data can be grouped and organized, and its formats and attributes.
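
To make this concrete, a toy model can even be sketched directly in code. The entities, fields, and relationships below are illustrative only; real data models are usually captured in diagrams or database definitions rather than application code.

```python
# Toy conceptual data model sketched with dataclasses
# (entities, fields, and relationships are illustrative).
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class Customer:
    customer_id: int
    name: str
    email: str


@dataclass
class OrderLine:
    product_sku: str
    quantity: int
    unit_price: float


@dataclass
class Order:
    order_id: int
    customer: Customer                                     # each order belongs to one customer
    order_date: date
    lines: List[OrderLine] = field(default_factory=list)  # one order groups many lines
```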

Data Lake Architecture

With the rapid advancement of technology, companies are now in search of a better way to ensure that organizational data and information are kept safe and organized. One way businesses are doing this is by using Data Lakes to create a centralized data management infrastructure that allows an organization to manage, store, analyze, and classify its data.

The concept of data lake architecture has recently become a hot topic. These days, businesses use data to define their internal business objectives and metrics, and data lakes offer agile analytics to measure a continually evolving business. Data lakes have really become the cornerstones of modern big data architecture.
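
As a rough sketch of the “centralized, store-everything” idea, the snippet below lands raw events in a lake-style folder layout as partitioned Parquet files; the paths, columns, and partitioning scheme are assumptions for illustration.

```python
# Rough sketch of landing raw events in a data lake as partitioned Parquet files
# (paths, columns, and the partitioning scheme are illustrative assumptions).
import pandas as pd

events = pd.DataFrame({
    "event_id":   [1, 2, 3],
    "event_type": ["click", "purchase", "click"],
    "country":    ["ES", "DE", "ES"],
    "ts":         pd.to_datetime(["2024-05-01", "2024-05-01", "2024-05-02"]),
})
events["event_date"] = events["ts"].dt.date

# Raw zone of the lake: keep the data as-is (schema-on-read),
# partitioned by date so later queries can prune cheaply.
events.to_parquet(
    "datalake/raw/events",
    partition_cols=["event_date"],
    engine="pyarrow",
)
```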

20 Business Intelligence Tools and Use Cases

With more and more data available, it is getting harder to focus on the information we really need and to present it in an actionable way, and that is exactly what business intelligence is all about. In this article, we will talk about Business Intelligence tools, benefits, and use cases.

What Is Business Intelligence

Business intelligence (BI) comprises the strategies and technologies used by enterprises for the data analysis of business information. BI technologies provide historical, current, and predictive views of business operations. It has become a necessary tool in the era of big data.

Data-as-a-Service (DaaS) Benefits and Trends

Businesses across the globe highlight DaaS not only as a unique revenue channel but also as a path to reshape the business world through competitive intelligence. The increasing importance of data and analytics is driving the importance of data as a service. External DaaS services enable companies to easily access external data. Internal DaaS services make it easier for companies to democratize analytics and empower their business users. So, in this article, we will discuss DaaS benefits and the latest trends.

What is DaaS?

Data-as-a-service (DaaS) is a data management strategy and a deployment model that focuses on the cloud to deliver a variety of data-related services such as storage, processing, and analytics. DaaS leverages the popular software-as-a-service (SaaS) paradigm, through which customers are able to use cloud-based software applications delivered over the network rather than deploying dedicated hardware servers for a specific set of tasks on a specific set of data.
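
In practice, consuming a DaaS offering often amounts to calling a hosted data API over HTTP. A minimal sketch follows; the URL, query parameters, and response shape are hypothetical.

```python
# Minimal sketch of consuming a hypothetical DaaS endpoint over HTTP
# (the URL, query parameters, and response shape are assumptions).
import requests

BASE_URL = "https://data-provider.example.com/v1"

response = requests.get(
    f"{BASE_URL}/exchange-rates",
    params={"base": "EUR", "symbols": "USD,GBP"},
    headers={"Authorization": "Bearer <api-token>"},
    timeout=10,
)
response.raise_for_status()
rates = response.json()
print(rates)  # e.g. {"base": "EUR", "rates": {"USD": 1.08, "GBP": 0.85}}
```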

Software Development Project Postmortem

If you’ve been a part of any software development project, you know things don’t always go as planned. In theory, projects have two possible extreme outcomes: success or failure. In reality, all projects show a blend of success and failure factors when a large number of factors are considered. And at Apiumhub, we believe that the best way to work through what happened during an incident and capture any lessons learned is by conducting a software development project postmortem.

What Is a Software Development Project Postmortem?

A software development project postmortem brings people together to discuss the details of an incident: why it happened, its impact, what actions were taken to mitigate and resolve it, and what should be done to prevent it from happening again.
A project postmortem is performed at the conclusion of a project to determine and analyze the elements of the project that were successful or unsuccessful and to capture them as lessons learned.
Project postmortems are intended to inform process improvements that mitigate future risks and to promote iterative best practices.