Tag Archives: SOA

When thinking Desktop “first” still matters

By Clive Howard, Principal Analyst, Creative Intellect Consulting

A few months back, I registered for Mobile World Congress 2015 in Barcelona. As an analyst, there is a different registration process from the one used for regular attendees, so that the organisers can validate that someone is a legitimate industry analyst. As well as a significant amount of personal data, additional information such as links to published work and document uploads is also required. Crucially, there are a number of screens to complete in the registration and accreditation process. More to the point, many different types of data must be entered – from single- and multiple-line text entry to file uploads. Some data (such as hyperlinks) requires cutting and pasting.

I’m sure that I could have done this using a mobile phone, but it would have taken a long time, been awkward and irritating, and probably been highly prone to mistakes. In short, I would never have considered doing something like this on my phone. Could I have used a tablet? Without a keyboard and mouse it would have been problematic, especially on a small screen. Using a tablet-only operating system might also have caused problems in places, such as uploading documents from centrally managed systems. Actually, I did use a tablet – but one connected to a 20-inch monitor, keyboard and mouse, and running Windows. In that traditional desktop-looking environment the process was relatively quick and painless.

Rumours of the desktop’s demise are greatly exaggerated

It is not just complex data entry scenarios such as this that challenge mobile devices. Increasingly I see people attach keyboards to their tablets and even phones. Once one moves beyond writing a tweet or a one-line email, many mobile devices start to become a pain to use. The reality of our lives, especially at work, is that we often have to enter data into complex processes. Mobile can be an excellent complement, but not a replacement. This is why we see so many mobile business apps providing only a tiny subset of the functionality found in the desktop alternative; or apps that extend desktop application capabilities rather than replicate or replace them.

One vendor known for its mobile-first mantra recently showed off a preview version of one of its best-known applications. This upgrade has been redesigned from the ground up. When I asked if it worked on mobile, the answer was no; they added (quite rightly) that no one is going to use this application on a mobile device. These situations made me think about how, over the last couple of years, we have heard relentlessly about designing “mobile first”. As developers we should build for mobile and then expand out to the desktop. The clear implication has been that the desktop’s days are over.

This is very far from the truth. Not only will people continue to support the vast number of legacy desktop applications, they will definitely be building new ones. Essentially, there will continue to be applications that are inherently “desktop first”. This statement should not be taken to mean that desktop application development remains business as usual. A new desktop application may still spawn mobile apps and need to support multiple operating systems and form factors. It may even need to engage with the Internet of Things.

The days of building just for the desktop safe in the knowledge that all users will be running the same PC environment (down to the keyboard style and monitor size) are gone in many if not the majority of cases. Remember that a desktop application may still be a browser based application, but one that works best on a desktop. And with the growth of devices such as hybrid laptop/tablet combinations, a desktop application could still have to work on a smaller screen that has touch capabilities.

It’s the desktop, but not as we know it

This means that architects, developers and designers need to modernise. Architects will need to design modern Service Orientated Architectures (SOA) that both expose and consume APIs (Application Programming Interfaces). SOA has been around for some time but has become more complex in recent years. For many years it meant creating a layer of SOAP (Simple Object Access Protocol) Web Services that your in-house development teams would consume. Now it is likely to mean RESTful services utilising JSON (JavaScript Object Notation) formatted data and potentially being consumed by developers outside of your organisation. API management, security, discovery, introspection and versioning will all be critical considerations.
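As a rough illustration of the shift described above, here is a minimal sketch of a versioned RESTful endpoint returning JSON; the resource names, paths and fields are invented for illustration and not taken from any particular framework:

```python
import json

# Minimal sketch of a versioned RESTful endpoint returning JSON.
# The resource name, path and fields are hypothetical.

def handle_request(method: str, path: str) -> tuple:
    """Route a request to a versioned resource; return (status, JSON body)."""
    if method == "GET" and path == "/api/v1/customers/42":
        body = {"id": 42, "name": "Acme Ltd", "links": {"self": path}}
        return 200, json.dumps(body)
    # Unknown versions or resources fail fast with a structured error,
    # giving external consumers a predictable contract.
    return 404, json.dumps({"error": "resource not found", "path": path})

status, body = handle_request("GET", "/api/v1/customers/42")
print(status, json.loads(body)["name"])
```

Versioning the path (`/api/v1/...`) is one common convention for keeping older consumers working while the service evolves.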

Developers will equally need to become familiar with working against web service APIs instead of the more traditional approach where application code talked directly to a database. They will also need to be able to create APIs for others to consume. Pulling applications together from a disparate collection of micro services (some hosted in the cloud) will become de rigueur. If they do not have skills that span different development platforms, they will at least need an appreciation of them. One of the problems with mobile development inside the enterprise has been developers building SOAP Web Services without knowing how difficult these are to consume from iOS apps. Different developer communities will need to engage with one another far more than they have done in the past.
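To illustrate working against a service API rather than a database, here is a hedged Python sketch; the function, the injected transport and the field names are all hypothetical:

```python
import json

# Sketch of the shift from direct database access to consuming an API.
# The transport is injected so the same code can run against a stub in
# tests or a real HTTP client in production.

def get_customer(customer_id: int, transport) -> dict:
    """Fetch a customer via a service API rather than querying a table."""
    status, body = transport("GET", f"/api/v1/customers/{customer_id}")
    if status != 200:
        raise LookupError(f"customer {customer_id} not found (HTTP {status})")
    return json.loads(body)

# Stub transport standing in for the remote service.
def fake_transport(method, path):
    return 200, json.dumps({"id": 7, "name": "Example Co"})

print(get_customer(7, fake_transport)["name"])
```

Injecting the transport also keeps the application code testable without a live service – one of the disciplines that direct database access rarely forced on teams.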

Those who work with the data layer will not be spared change. Big Data will affect the way in which some data is stored, managed and queried, while NoSQL data stores will become more commonplace. The burden placed on data stores by major increases in the levels of access caused by having more requests coming from more places will require highly optimised data access operations. The difference between data that is accessed a lot for read-only purposes and data which needs to be changed will be highly significant. We are seeing this with banking apps where certain data such as a customer’s balance will be handled differently compared to data involved in transactions. Data caching, perhaps in the cloud, is a popular mechanism for handling the read-only data.
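A minimal sketch of the read-only caching idea mentioned above, assuming a simple in-process cache with a time-to-live; the loader function and TTL are illustrative choices, not a prescription:

```python
import time

# Read-through cache sketch for read-heavy data such as an account
# balance. Loader, key names and TTL are hypothetical.

class ReadCache:
    def __init__(self, loader, ttl_seconds=30):
        self.loader = loader
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry time)

    def get(self, key):
        value, expiry = self._store.get(key, (None, 0.0))
        if time.monotonic() < expiry:
            return value  # served from cache; no backend hit
        value = self.loader(key)  # cache miss: go to the data store
        self._store[key] = (value, time.monotonic() + self.ttl)
        return value

calls = []
def load_balance(account_id):
    calls.append(account_id)
    return 125.50

cache = ReadCache(load_balance, ttl_seconds=60)
cache.get("acct-1")
cache.get("acct-1")  # second read is served from the cache
print(len(calls))    # the backend was queried only once
```

In practice the cache would often live in a shared service (possibly in the cloud, as the paragraph notes) rather than in-process, but the read/write split is the same.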

Continuation of the Testing challenge

Testing will need to take into account the new architecture, design paradigms and potential end user scenarios. Test methodologies and tools will need to adapt and change to do this. The application stack is becoming increasingly complex. A time delay experienced within the application UI may be the result of a micro service deep in the system’s backend. Testing therefore needs to cover the whole stack – a long-standing challenge for many tools on the market – and architects and developers will need to make sure that failures in third-party services are handled gracefully. One major vendor had a significant outage of a new Cloud product within the first few days of launch because of a dependency on a third-party service whose failure they had not accounted for.
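One common way to handle a third-party failure gracefully is a fallback wrapper; this Python sketch is illustrative only, with an invented flaky service and fallback value:

```python
# Sketch of degrading gracefully when a third-party service fails,
# rather than letting the outage cascade into the whole application.
# The service call and fallback data are hypothetical.

def call_with_fallback(service, fallback):
    """Return the service result, or a safe fallback if it raises."""
    try:
        return service()
    except Exception:
        # Log and degrade: the UI can show cached or partial data
        # instead of an error page.
        return fallback

def flaky_rates_service():
    raise TimeoutError("upstream exchange-rate service timed out")

rates = call_with_fallback(flaky_rates_service, fallback={"GBP": 1.0})
print(rates)
```

Real systems usually combine this with timeouts and circuit breakers, but even this simple pattern would have contained the kind of outage described above.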

Legacy: Old technology that frightens developers (part 1)

By Clive Howard, Principal Practitioner Analyst, Creative Intellect Consulting

To developers the term legacy is often a dirty word, meaning old software that is a pain to work with. Ironically, of course, it is the software that developers spend most of their time working with – and developers made it what it is. The question all developers should ask is why legacy software is generally considered to be bad, and what can be done to avoid this situation in future. After all, an application released today will be legacy tomorrow.

Development teams do not set out to create bad software that will become difficult to maintain, support and extend. When allowed by their tyrannical masters, architects and developers put a lot of work in upfront to try to avoid the typical problems of bloat, technical debt and bugs that they fear happening later. For some reason, over the years these problems seem to have become inevitabilities.

The types of issues that make developers fear working with legacy include: technologies that are no longer fit for purpose; bloated codebases that are impossible to understand; different patterns used to achieve the same outcome; lack of documentation; inexplicable hacks and workarounds; and a lack of consistency – plus many, many more. Most of these have their roots in a combination of design and coding.

Design theory does not always reflect reality

Architects aim to design clean, performant, scalable and extensible applications. Modern applications are complex, involving multiple “layers”, often distributed from a hardware perspective and including third-party and/or existing legacy applications. Different components will frequently be the responsibility of different development teams working in different programming languages and tools.

For some time now the principle of separation has been applied to try to avoid the tightly coupled client/server applications of the past that were known to cause many legacy issues. This has gone under many guises: “separation of concerns”, n-tier, Service Orientated Architecture (SOA) and so on. They are all variants of the same concept: the more separated the components of an application are, the more flexible, scalable, extensible and testable that application will be. For developers, an application made of smaller parts is more manageable from a code perspective.

One of the classic scenarios is the interchangeable database idea. An application might start life using one database, but later on need to change to another. The concept of ODBC meant that it was easy to simply change a connection string in code, and provided the new database had the same structure as the previous one, everything would continue without a hitch. The problem has been that what looks good theoretically doesn’t hold up in reality.

In the database example, the reality often was that there were a number of stored procedures, triggers or functions in the database. Changing from one database to another meant porting these, and that in itself can be a significant task. The time, and therefore cost, of such an activity resulted in the old database continuing in place. Hence today we find so many applications running on unsuitable databases such as Access or FileMaker. A developer then has the frustration of having to work with inherently limiting and non-performant code.
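One way to keep a database genuinely swappable is to hold business rules in application code instead of stored procedures or triggers; the following sketch, with an invented schema and discount rule, illustrates the idea using SQLite:

```python
import sqlite3

# Sketch of keeping business rules in application code rather than in
# stored procedures, so the database underneath stays swappable.
# The schema and the discount rule are hypothetical.

def order_total(conn, order_id: int) -> float:
    rows = conn.execute(
        "SELECT price, quantity FROM order_lines WHERE order_id = ?",
        (order_id,),
    ).fetchall()
    total = sum(price * qty for price, qty in rows)
    # The business rule lives here, not in a trigger or stored procedure,
    # so porting to another database mostly means changing the connection.
    return total * 0.9 if total > 100 else total

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE order_lines (order_id INT, price REAL, quantity INT)")
conn.executemany("INSERT INTO order_lines VALUES (1, ?, ?)", [(60.0, 1), (50.0, 1)])
print(order_total(conn, 1))  # 110.0 total, discounted to 99.0
```

The trade-off is real: stored procedures can be faster and closer to the data, which is exactly why so many ended up in the database in the first place.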

No immunity from separatist design strategies

If we move forward to many of today’s architecture patterns such as SOA we still see similar problems. The concept of SOA is that components of an application become loosely coupled and so different parts of the application are less wedded to one another. Unfortunately within the separate services and consumers the same problems as outlined above can apply.

Worse still, many service providers do not version their services. Google Maps will often bring out a new version of its service while clients calling the previous version continue to function. However, many others (social networks take note) do not follow this practice and frequently push out breaking changes to their service. This introduces a whole new problem into legacy applications, whereby developers have to regularly go back into code and update it to work with the changes to the service.
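The versioning practice described above can be sketched simply: a breaking change (here, a hypothetical field rename) goes into a new versioned path while the old path keeps serving existing clients unchanged. Paths and field names are invented for illustration:

```python
import json

# Sketch of versioning a service so existing clients keep working when
# the response shape changes.

def customer_v1(cid):
    return {"id": cid, "fullname": "Ada Lovelace"}

def customer_v2(cid):
    # v2 renamed "fullname" to "name": a breaking change, so it lives
    # under a new version while v1 continues to be served unchanged.
    return {"id": cid, "name": "Ada Lovelace"}

ROUTES = {"/api/v1/customers": customer_v1, "/api/v2/customers": customer_v2}

def get(path, cid):
    return json.dumps(ROUTES[path](cid))

print(get("/api/v1/customers", 1))  # old clients still see "fullname"
```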

Enabling the Mobile App? (Part 2)

Guest Post by Clive Howard, Principal Practitioner Analyst, Creative Intellect Consulting

Read Part 1: Enabling the Mobile App?

A new architecture for a new world

As seen in the previous post, the real challenge is what lies behind the app. The solution, for many, is to move to a new type of architecture where code can be shared and re-used across many different use cases. Business logic needs to be contained in one place, which all client applications reuse. Client applications become essentially UIs developed for the specific environment in which they run (phone, tablet, desktop and so on).

Developers need only maintain the majority of an application’s functional requirements within a single code base, around which they can build a suite of tests (such as Unit Tests), implement security and manage a single deployment process. Like the business case for hybrid, the case for such a new architecture is compelling as it reduces time and cost over the Application Lifecycle. In addition, it becomes faster and easier to develop and deploy new clients.

One such approach is Service Orientated Architecture (SOA), which makes use of a middleware layer of Web Services. Web Services have been around for some time and there are a number of frameworks and tools for creating them, both from vendors such as Microsoft and IBM and from Open Source solutions. The traditional Web Service used the Simple Object Access Protocol (SOAP), which is still popular within the enterprise today. Whilst SOAP had developer benefits, such as the WSDL which made it easy to discover service methods and data structures, it required a lot of XML-formatted data to be passed between client and server.

The world of mobile has low-bandwidth networks and high-cost data charges, so a far more lightweight data transfer method was needed. The most popular to emerge was the Representational State Transfer (REST) based approach, which typically uses JavaScript Object Notation (JSON) to structure data.
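The payload difference is easy to demonstrate. Below, a made-up balance message is expressed both as a SOAP envelope and as a roughly equivalent JSON body; the message content is invented:

```python
import json

# Rough illustration of the payload-size gap between a SOAP envelope
# and the equivalent JSON body. The message content is made up.

soap = (
    '<?xml version="1.0"?>'
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soap:Body><GetBalanceResponse>"
    "<AccountId>42</AccountId><Balance>125.50</Balance>"
    "</GetBalanceResponse></soap:Body></soap:Envelope>"
)
rest = json.dumps({"accountId": 42, "balance": 125.50})

print(len(soap), len(rest))  # the JSON body is several times smaller
```

The envelope, namespaces and closing tags are pure overhead from a mobile network’s point of view, which is why REST/JSON won out for device-facing APIs.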

Any developer working with SOA to support mobile apps must consider the size and speed of data over the wire or risk users incurring prohibitive usage costs and poor app performance. Therefore when building for devices it is important to not just consider the code within the app itself but also the communication with the server and the performance and security of the server based code.

Developers need to consider their responsibilities in creating backend services. For example, in traditional client/server development it has been easy to have databases return large record sets to the client (in the event that any of those fields are needed in future). When data starts moving across mobile networks, it is not just the format that is important but the volume. If an app only needs five columns from a database table then return only those five, as that will keep the data packet size to a minimum.

Use tools such as mobile simulators to mimic different types of network and bandwidth availability to check the impact on app performance and optimise accordingly. Don’t forget the “no network connection” scenario and handling any offline data change operations.

Choosing the right tools and frameworks should help developers create, analyse and optimise all areas of development and so create secure, high performing and usable apps.

Building for success beyond today’s mobile needs

The challenge for organisations is how fast they can move to this type of architecture to support the burgeoning suite of mobile apps that their business requires. Many will look for ways to stop-gap the situation to roll out mobile apps whilst addressing the bigger architectural shift. That approach may involve wasted effort as interim solutions are scrapped later, but could provide useful learning opportunities. Existing technology choices will dictate how, and therefore how fast, this transition can be made.

As we move into a rapidly changing world of devices it would behove organisations to adopt technology stacks that enable not just the ability to share data and logic with multiple user endpoints but also to be deployed to the Cloud.

Smart IT functions that get this transition right will deliver significant competitive advantage and cost savings for the businesses. We are only at the beginning of a wave of new devices and functional requirements that extend applications beyond the company firewall.

This is especially relevant within the enterprise which has historically been able to move slowly in adopting IT trends. Now (and increasingly going forward) they will be under pressure from inside and outside the organisation to move far more quickly. The decisions they make now to support mobile may have repercussions for some time to come.

In Pursuit of Sustainable Legacy Modernization (part 2)

Guest contributor, Paul Herzlich from analyst firm Creative Intellect Consulting

(Part 1: http://unifaceinfo.com/in-pursuit-of-sustainable-legacy-modernization-part-1/)

Although discussions of modernization often center on a world of technically dazzling possibilities, the reality for many organizations is much less exciting. Long before they consider their ambitions to be state-of-the-art, they have to confront the fact that their applications probably lack, or are even incompatible with, only moderately recent technologies – such as object orientation, SOA, web services and XML – that would allow them to participate in the interconnected world of composite applications. Forget about whether they can make use of the latest shiny technological tricks. They realize that living outside the technological mainstream is costing them in a whole host of ways:

      • They’re at risk while they depend on obsolete hardware and unsupported system software, databases, and middleware.
      • They maintain separate development tools for software development with a raft of side effects: separate licenses and less capable, less integrated and less productive tools than those available for mainstream languages.
      • They and their IT service partners find it difficult to recruit staff; legacy languages and platforms require niche skills that lack pulling power for new recruits.

For many managers, the straightforward solution to modernization is to rewrite applications in a trendy language on a modern platform, or to replace them with a packaged application. Rewriting and replacement are both valid options, but they can be expensive and highly risky. There are incremental alternatives.

A typical stepping-stone to modernization involves wrapping existing core business systems in modern interfaces and treating the legacy systems as if they were a black box. You write an interface that transforms data and commands between new and old technologies. The black box modernization strategy is popular but has architectural limitations and negative side effects on IT processes.
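The black-box wrapper can be sketched as a thin adapter that translates between a modern JSON interface and a legacy record format; the fixed-width layout used here is entirely invented for illustration:

```python
import json

# Sketch of the black-box wrapper: a thin adapter translates between a
# modern JSON facade and a legacy fixed-width record format.
# The record layout is hypothetical.

def legacy_lookup(record: str) -> str:
    """Stand-in for the untouched legacy system: fixed-width in and out."""
    account = record[:8].strip()
    return f"{account:<8}00012550"  # balance in pence, zero-padded

def get_balance(account: str) -> str:
    """Modern facade: JSON in/out, with the legacy formats hidden inside."""
    raw = legacy_lookup(f"{account:<8}")
    pence = int(raw[8:16])
    return json.dumps({"account": account, "balance": pence / 100})

print(get_balance("A1"))
```

Note that all the adapter does is transform data and commands; any business logic the front end needs that the legacy system cannot supply ends up duplicated, which is exactly the first limitation listed below.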

      • The distribution of logic across front and back ends is not optimal; often business logic has to be duplicated. Once it is duplicated, you have duplicate maintenance and a new risk of errors when new and old logic mistakenly diverge.
      • The timing of back-end processes – many of which are batch – doesn’t match the interactive nature of the modernized front-end. The black box implementation may appear awkward or incomplete.
      • Without extensive back-end changes, you are still largely limited to data catered for in your back-end data model.
      • It places extra burdens (which mean costs) on IT processes. You multiply the development and test environments, skillsets and tools required.
      • Although in theory you keep your core applications, often they still require modification. For example, if you add mobile access, you may need to extend your legacy system to cater for new locational data. All this adds complexity, effort and risk to coding, integration and testing.