Category Archives: Blog

DSPs provide access to the DOM layer

DSP JS API function: uniface.field.getBoundElement()

Putting application developers in control of the presentation layer

HTML5 already provides a powerful set of form controls out of the box, and its functionality is continuing to grow and mature. To get access to all that functionality, application developers need to be able to interface directly with the controls. Uniface 10.3 Dynamic Server Pages provides exactly that capability.

Before we go into detail, let’s see how Uniface’s support for web technology has evolved over time to give developers more and more control over their applications.

The beginning: Uniface Server Pages

Uniface 7.2 provided our first functionality for the web: The Static Server Page, also known as the Uniface Server Page (USP). USPs enable binding between the Uniface server-side data structure and an HTML client side. They handle communication between client and server in a very simple way: the server generates an HTML page complete with data substituted into the HTML, and the browser simply displays that HTML page.

Updates are initiated by the browser via a standard HTML submit, after which the server will load the updates and reconnect them with the back end. After that, the server again generates a full-page response with all changes handled, and the whole process starts again from the beginning.

A leap forward: Dynamic Server Pages

Uniface 9.4 introduced the Dynamic Server Page (DSP) allowing Uniface developers to create rich internet applications (RIAs). The biggest difference between a USP and a DSP is the absence of full-page refreshes in the DSP. Obviously, data is still sent to the server and received back by the client, but instead of the whole page being refreshed, only the affected data is returned and merged into the page displayed in the browser. All communication is handled by the Uniface DSP and programming is as easy as writing some ProcScript in a trigger.

In addition, Uniface 9.4 DSPs provide out-of-the-box client-side syntax checking, which, in case of a syntax error, avoids a round trip to the server. A DSP consists of a server part and a client part. The client part has a copy of the component definition, which is what allows the client to perform syntax checking.

Introducing the ability to manipulate client data

Initially, Uniface application developers could write business logic solely for the server, using ProcScript; the client side was closed to them.

Uniface 9.5 changed that by introducing the JavaScript (JS) API. Uniface application developers can use this to access the Uniface data structure on the client, and can manipulate Uniface data there using JS, without the need to go back to the server. The JS API provides access to field values, properties, valreps, creation and deletion of occurrences, etc.
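To give an idea of what this looks like, here is a minimal sketch of client-side value manipulation via the JS API. It assumes a field webtrigger in which this refers to the field object, in the same style as the onSyntaxError example later in this post; the trigger name onChange is an assumption, so check the JS API documentation for the triggers and methods available in your version.

webtrigger onChange
javascript
  // 'this' is the Uniface field object in the client-side data structure
  var field = this;
  // Read the current value via the JS API, without a round trip to the server
  var value = field.getValue();
  // Normalize the value on the client and write it back to the field
  field.setValue(value.toUpperCase());
endjavascript
end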

Application developers take charge of the presentation layer

With Uniface 10.3, we have now also opened up the presentation layer of the client: the Document Object Model (DOM) layer. Using a simple function, a Uniface data object can now get a reference to its bound element in the DOM layer, allowing Uniface developers to access DOM elements in the context of that field, its occurrences, and its instance. The function is: uniface.field.getBoundElement(ViewId)

From the bound DOM element, it is possible to navigate to sibling and parent elements. In the case of an AttributesOnly field, the same technique can be used to navigate to child elements. This gives Uniface developers full control of the DOM, allowing integration of third-party JS libraries that work at the DOM level.

An example

In the following code example, we use the new JS function to change the default reporting of client-side syntax errors.

The webtrigger onSyntaxError is the trigger that fires the moment the client encounters a syntax error. By default, Uniface responds by setting an error class on the element bound to the field that is in error; CSS then styles it appropriately. The code below overrides the default behavior and sets the error class on the parent element of the element bound to the field:

webtrigger onSyntaxError
javascript
  // Get field in the data structure
  var field = this;
  // Get bound element of field in layout
  var fieldEl = field.getBoundElement();
  // Get parent of element
  var parentEl = fieldEl.parentElement;
  // Set error class on parent element instead of the field element itself
  parentEl.classList.add("-highlight-error-");
  // Prevent default error reporting
  return false;
endjavascript
end
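
The same bound-element technique is the hook for integrating third-party widgets. The sketch below is illustrative only: ThirdPartyDatePicker stands in for whatever JS library you want to attach (it is not a real library), and the webtrigger name is an assumption.

webtrigger onLoad
javascript
  var field = this;
  // Get the DOM element bound to this field in the layout
  var fieldEl = field.getBoundElement();
  // Hand that element to a third-party widget (hypothetical library and API)
  var picker = new ThirdPartyDatePicker(fieldEl);
  // Push the widget's selection back into the Uniface field on the client
  picker.onSelect(function (dateString) {
    field.setValue(dateString);
  });
endjavascript
end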

Conclusion

The getBoundElement() function is simple to use and provides full access to the DOM layer of the browser. It opens up communication with any JS technology that needs to interface with the presentation layer.

Uniface Security

Blog by Jason Huggins

The latest releases of Uniface 9 and 10 mark a significant milestone in the enhancement of security, both under the covers and in new functionality for securing applications.

I believe that in practice all organizations need to protect business confidentiality and their competitive edge, adhere to applicable privacy regulations, and prevent data theft and manipulation. Protecting data is paramount for practically everyone. It can feel like the wild west at times, with attacks coming from all directions: an employee, a contractor or visitor, a cyber-criminal, malware or ransomware, accidental hackers, curious observers… the list goes on! Whether the data breach is internal, external, malicious or accidental, the risk should be understood, assessed and addressed.

Statistics on the number, size and cost of breaches show that global data breaches have increased year on year. The current average cost of a breach is in the millions of dollars, with total costs per year globally in the billions. Breach sizes have ranged from tens of millions of confidential records up to many billions of lines of data.

Predictions suggest that a clear majority of enterprise traffic will be encrypted throughout 2019. It is important for Uniface to support this, whilst making it easy to utilize as part of the development and deployment platform.

What is the threat?

There are many threats to data security, and network security exposes a key flaw. There is an inherent weakness in the standard TCP/IP network model and IPv4: none of the layers include security features as such. The protocols ensure very reliable transmission of data but do not fully ensure data integrity or confidentiality. It is extremely easy to sniff and tamper with the data in real time.

But wait, what about IPv6, you may ask? Well, IPsec, a security protocol, is built into the IPv6 standard, but it is not mandatory. IPv6 officially launched in 2012, yet IPv4 is still the dominant protocol, covering around 75% of deployments. The rate of adoption appears to be slowing, but that does not mean IPv6 will not become the dominant standard; it may just take a little longer than expected. IPsec within IPv6 will not necessarily become the drop-in solution to the security hole, so it is still valid to apply alternative or additional mechanisms to secure the transmitted data. The Uniface implementation means that the application can easily and reliably ensure encryption is applied, whatever the underlying IPv'N' network infrastructure and protocol support may be.

What’s new in network security?

Uniface now has a cryptography layer added to its network stack. The implementation is a TLS layer built on top of the standard TCP driver. The TCP driver itself has been refactored, yielding several improvements. The new TLS driver utilises OpenSSL libraries. OpenSSL, often referred to as the ‘Swiss Army Knife’ of cryptography, is well maintained and supported, has excellent platform coverage, and is backed by major organizations. It implements both pre-shared key (PSK) and asymmetric certificate/key pair verification, the latter providing greater levels of security. The cryptography methods supported, called ciphers, are those of OpenSSL; however, by default Uniface will only list the stronger ciphers.

The new driver encrypts the network traffic, including IPv6, between Uniface processes encompassing both shared and exclusive servers. A key feature supported by the TLS driver is ‘Peer Name Verification’, which helps mitigate compromises such as ‘Man in the Middle’ attacks.

Configuration is very straightforward and matches the typical driver approach, with familiar mnemonics such as tls: and USYS$TLS_PARAMS. The configuration and its various possibilities are well documented in the help.
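
As a rough illustration, the assignment-file shape could look something like the sketch below. Treat it purely as a sketch: the path name, host, port and especially the contents of USYS$TLS_PARAMS are placeholders, and the real parameter names and syntax are the ones documented in the help.

; Sketch of a client-side assignment file (placeholder values throughout)
[DRIVER_SETTINGS]
; TLS settings (certificates, verification, ciphers) go here; see the help for the real parameters
USYS$TLS_PARAMS = <your TLS settings>

[PATHS]
; Use the tls: connector instead of tcp: to reach the server (example host and port)
$MYSERVER = tls:appserver.example.com+13001|user|password|servername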

Considerations

Security is a shared responsibility spanning development and operations. Since enabling TLS is mostly a configuration exercise, developers will see little change. The extra processing needed to encrypt and decrypt may have an influence; for example, transaction size and client versus server processing could become considerations. Note: Uniface benchmarks match the published OpenSSL results.

Operations should understand security, TLS and encryption, making sure to pick ciphers that adhere to internal policies whilst maximising performance. The pathscrambler is essential and must be used to safeguard the TLS configuration settings.

The TLS driver is simple to use and should be considered an essential priority for most.

Counters

Global Objects

There are many types of Global Objects, like Messages, Global Procs and Keyboard Translation Tables, to name a few. The Uniface 9 IDF and the Uniface 10 IDE provide editors to maintain the definitions of those objects. You could consider those as the definition-time source objects. Successful compilation of those source objects results in compiled objects, their run-time counterparts.

The compiled objects are static objects. User applications can use them, but they have no way of changing them. To change them, you must use the editors in the Uniface development environment (the version 9 IDF or the version 10 IDE) to change their source objects, then compile those and make the resulting compiled objects available to the application.

Counter – the oddball

There is one particular type of Global Object that is different from the others: the Counter. Contrary to other Global Objects, Counters are not static run-time objects. Any application can change them through ProcScript. The numset statement creates a counter or re-initializes its value, and the numgen statement increases a counter’s value. Considering this, you may even consider Counters as run-time data rather than run-time objects.

To maintain Counters, Uniface 9 offers the Define Counter dialog. This dialog gives the impression that, like for other Global Objects, it maintains source objects. However, it does not. In fact, there are no source objects for Counters. They only exist as run-time objects, in UOBJ.TEXT. The Define Counter dialog acts directly on those run-time objects.

If your application uses Counters, be aware of these aspects, and apply extra care when deploying UOBJ.TEXT. Also, users of the Define Counter dialog might just accidentally change the value of a Counter that is being used by an application.

Migrating Counters to Uniface 10

Uniface 10 is straightforward: it simply regards Counters as user data. The one and only way to change them is through the ProcScript instructions that are intended for just that purpose: numset and numgen. There is no dialog that can be used to adjust Counter values.

If you already initialize and maintain your application’s Counters purely by means of ProcScript logic, there is no extra effort involved for the migration of Counters to version 10. This logic will continue to work as it did in version 9.

If, on the other hand, you use the IDF’s Define Counter dialog to initialize and maintain your application’s Counters, you will need to act. To achieve the same functionality in version 10, you must implement that logic yourself, using the available ProcScript instructions. Also, you will need to apply the same care as you did in version 9, to make sure that UOBJ.TEXT is properly distributed and/or installed.

This example auto-increments a counter. If it does not exist yet, it creates it and gives it an initial value:

; Auto-increment counter
numgen "MYCOUNTER", 1, "MYLIB"
if ($status == <UPROCERR_COUNTER>)
  ; Counter does not exist.
  ; Create it and give it an initial value of 1001.
  numset "MYCOUNTER", 1001, "MYLIB"
  if ($status < 0)
    message "Error generating sequence number."
    rollback "$UUU"
    done
  else
    SEQNO = 1001
    commit "$UUU"
  endif
else
  SEQNO = $result
  commit "$UUU"
endif

Blog: Frank Doodeman
Frank is a Software Architect at Uniface.

Uniface 10.3: the version to go for

In case you’ve missed the summer’s exciting news from Uniface headquarters, Uniface 10.3 has now arrived.

I’ve already been working with this new version for a while: initially with a couple of pre-releases, and for the past few weeks with the live release. This experience has convinced me that Uniface 10, and version 10.3 in particular, is the version the Uniface community has been waiting for. I’m writing this blog post to explain why, and especially to share my experiences with the new IDE.

Background: Uniface 10

Uniface 10 was designed and built based on the wishes of the Uniface developer community. Hundreds of questions and requests from Uniface developers all over the world were taken into account during this extensive design exercise.

The result is a complete overhaul of the Uniface development environment. Uniface 10 has a whole new look and feel, comparable with any modern IDE.

Although it’s still recognizably Uniface, developers may need a little time to get used to the new version; in my experience, that will be time well spent. There’s no way I’m going back to Uniface 9!

A major difference from earlier versions is that Uniface 10 is a non-modal development environment, which means you can work with as many Uniface objects as you like in parallel. Being able to switch between components with just one click makes development easier and more efficient. This by itself is a great reason to start using Uniface 10.

Highlights of Uniface 10

Here are some of the enhancements that you’ll notice immediately when you start using Uniface 10 for the first time:

  • The IDE’s performance has significantly improved, making the non-modal concept a pleasure to work with.
  • The graphical form painter functionality is drastically improved – a strong argument for client/server developers to switch to Uniface 10.
  • Debugging is faster: every error and warning message in the compiler output contains a hyperlink to the relevant line of code.
  • There’s a completely updated and stable meta-dictionary, so developers can safely port their existing custom-written utilities to Uniface 10. As in previous Uniface versions, additional menu items can be used to launch these utilities.
  • Uniface 10 now also has user-defined menus and user-defined worksheets. My experience shows these are very powerful. Yes, you might need to modify your tools, but again it’s worthwhile.
  • The new Transport Layer Security (TLS) network connector makes the network connection between client and server secure – vital for business-critical applications.

I’ll discuss many of these enhancements in more detail in future posts.

As well as all these major improvements, Uniface 10 brings some smaller “nice to haves”. For example, I’m pleased to have the option to set the title bar text of the IDE application window.

Migrating to Uniface 10

The migration process from Uniface 9.7 to Uniface 10 has been run by the Uniface team over and over again. Many huge Uniface 9 customer applications have been migrated successfully to Uniface 10. So for those currently on Uniface 9.6 or 9.7, migration is likely to be a smooth process.

If, on the other hand, you are currently considering migrating to Uniface 9.7.05, my advice would be to move directly to Uniface 10 instead, because of the advantages described above. (This is also Uniface’s advice.) It means one migration rather than two and ensures long-term support.

Conclusion: based on my experience, I believe Uniface 10.3 is the version to go for.

Blog: Peter Lammersma
Peter Lammersma is an entrepreneur and IT and business consultant. Peter works extensively with Uniface 10. As a long-serving member of the Uniface community, he’s kindly agreed to give his independent perspective in this blog series.

Do I need to compile this?

Over the years many Uniface developers have created tools on top of the Uniface Repository.

One tool that has been made by many is one that looks for “dirty” objects: objects that were modified after they were last compiled successfully.

In Uniface 9 such a tool would have been based on comparing the fields UTIMESTAMP (modification date of the source) and UCOMPSTAMP (compilation date of the source) of various Uniface Repository tables.

In Uniface 10 this has changed, mainly to align the repository with the export format that has been optimized for version management:

  • The modification date of a development object source is only found in the main object. And yes, it is also updated when you change a sub-object. So if you change a modeled field, the UTIMESTAMP of the entity is updated.
  • The compilation date of a development object is no longer an attribute of the source code. It does not have much value to know when the source was last compiled, if you can’t be sure that the compiled object was the result of that compilation. Someone may have copied a different compiled object to the output folder. The only real compilation date is that of the compiled object (file on disk or in a UAR).

Uniface 10.3 is the first release of Uniface 10 that is shipped with meta definitions: the new DICT model is published. So now you can re-engineer the tools that you made for Uniface 9.

In order to make it easier to (re)write a tool, the $ude("exist") function has been enhanced to return the attributes of the compiled object (file on disk or in a UAR), such as the modification date.

Compiling objects because their source code has changed

It is not just components that require compilation. There are 14 types of development object that require compilation and generate a file in your resources. I have attached a sample tool that checks whether these objects are “dirty” and therefore require compilation.

The tool gathers the source modification date from main development objects, and the compilation date of the compiled objects. In some cases, one main development object (such as a message library) results in many compiled objects (messages).

The tool uses $ude("exist") to check the compilation timestamp of the compiled object and $ude("compile") to compile it.
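
As a rough sketch of that check (not the attached tool itself), a helper entry could look like the code below. The exact $ude("exist") argument list, its return value, and the MODIFICATIONDATE item name are assumptions to verify against the Uniface 10.3 help before use.

; Dirty check for a single object (argument lists and item names are placeholders)
entry RECOMPILE_IF_DIRTY
params
  string pType : IN         ; e.g. "component"
  string pName : IN         ; e.g. "MYCOMP"
  string pSourceStamp : IN  ; UTIMESTAMP read from the main development object
endparams
variables
  string vCompiledStamp
endvariables
  if ($ude("exist", pType, pName) <= 0)
    ; No compiled object in the resources yet, so compile it now.
    $status = $ude("compile", pType, pName)
    return $status
  endif
  ; Assume the compiled object's attributes come back in $result as an associative list.
  vCompiledStamp = $item("MODIFICATIONDATE", $result)
  if (pSourceStamp > vCompiledStamp)
    ; The source is newer than the compiled object: it is "dirty", so recompile it.
    $status = $ude("compile", pType, pName)
  endif
  return $status
end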

The attached export file contains a full project export, so when you open project WIZ_COMPILE, you will see the whole thing.

You can download the export here: [download id="7581"]
And here is a file with some test data for each object type: [download id="7585"]

You will need the Uniface 10.3 DICT model to compile the tool. The new DICT model for Uniface 10.3 is delivered in umeta.xml in the uniface\misc folder of your development installation.

PLEASE NOTE: The sample tool does NOT take into account that a component may require compilation because a modeled entity or an IncludeScript has changed. See below.

Compiling components because a modeled entity has changed

Please note that the attached sample does NOT check if a component requires compilation because a modeled entity has changed. If you had this check in your Uniface 9 tooling, you also need to implement it in your new tooling.

A Uniface 9 based example for this issue can be found here:
http://theunifaceuniverse.blogspot.nl/2011/04/change-entity-compile-forms.html
You would need to simplify this for Uniface 10 because the modification date is only on the main development object.

Compiling objects because an IncludeScript has changed

Please note that the attached sample does NOT check if a development object requires compilation because an IncludeScript has changed.

Implementing this is quite tricky, as you would have to find the #include directives within the IncludeScript code and handle the fact that they can be nested. Correctly parsing all of that might not be much faster than just compiling everything…