Introducing Dimensions CM Git Client and Git Extensions

Do you know Git and want to quickly move to Dimensions CM?

Do you work on multiple projects, some of which use Git and some Dimensions CM?

Do you use an IDE that doesn’t integrate with Dimensions CM?

I have good news for you. The Dimensions CM team has recently released the Dimensions CM Git Client and Git Client Extensions for the most popular IDEs.

What is Dimensions CM Git Client? It’s software installed on top of a standard Git installation that lets you clone sources from a Dimensions CM repository and synchronize changes back to Dimensions while still working with a local Git repository. Every commit becomes a changeset in Dimensions, and you can still use Dimensions requests, reviews, and pull requests. The Git Client is available on Windows, Linux, and macOS. The download link requires a login, but don’t hesitate to create an account and download it – it’s free of charge.

What about the Git Client Extensions for IDEs? The extensions simplify cloning, provide access to Dimensions requests, reviews, and Micro Focus Pulse, and work together with the Git Client to provide the best user experience in your favorite IDE. Extensions are available for IntelliJ IDEA-based IDEs, Eclipse, Visual Studio, and VS Code. The IntelliJ plugin is available for download from the plugins repository; refer to the SupportLine site to find the other plugins. They’re also free of charge.

How does it work? It’s easy. Run Git Bash and type:
> git clone dimensions://stl-ta-vcw12-8/cm_typical@dim12/qlarius/mainline_vs_str/
Cloning into 'mainline_vs_str'...
done.

Now the sources from Dimensions are stored in a fully functional Git repository. The URL is a source code locator for the remote Dimensions repository; its syntax is:

dimensions://server/db_name@db_connection/product/stream

The first time you connect, you’ll be asked to enter credentials for the remote server. It is recommended to use Git Credential Manager to store passwords securely.
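For example, a minimal one-time setup (assuming Git Credential Manager is installed alongside Git; on older installations the helper may be named manager-core):

> git config --global credential.helper manager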

Let’s do the same in VS Code:

View > Command Palette… > Git: Clone

URL: dimensions://stl-ta-vcw12-8/cm_typical@dim12/qlarius/mainline_vs_str/

Or better:

View > Command Palette… > Dimensions: Clone

Connection string: dimensions://stl-ta-vcw12-8/cm_typical@dim12/qlarius/mainline_vs_str

Now pick the stream to clone, then pick a path > Select > Open

Voila! The repository is cloned and opened in VS Code.

How do you synchronize?

It’s the same as you usually do with Git: commit and push:

> git add -A && git commit -m "Making changes"
> git push

If you wish to use a request:

> git add -A && git commit -m "Making changes with a request [qlarius_cr_35]"
> git push

Note the request ID in square brackets, [qlarius_cr_35] - this instructs the Git Client that you want to relate a request to the commit. When the changes are synchronized to Dimensions, this commit becomes a changeset related to request QLARIUS_CR_35.
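Because the Git Client builds on standard Git, you can also amend a commit that hasn’t been pushed yet to add a forgotten request ID, for example:

> git commit --amend -m "Making changes with a request [qlarius_cr_35]"
> git push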

Run git pull to update changes from the Dimensions repository. Every Dimensions changeset becomes a commit in your local Git repository.

From VS Code it’s even simpler:

Go to View > SCM

Note the Dimensions Request section > Press Set Default Request > Pick the request from your inbox > The request is added to the commit message

Now review and stage the changes, add a comment, and hit Commit. Press Synchronize Changes in the status bar to synchronize with the remote repository.

To open the request in Pulse select:

View > Command Palette… > Dimensions: Open Request

The Pulse request page opens in a browser window. You can view the request summary, action the request to a new state, or update request attributes.

Try out the Dimensions CM Git Client and the Git extensions for IDEs. Also check our Documentation Center topics for the Git Client and for the Git extensions.

 

 

 

 


NEW COURSE: Silk Central Essentials (SLK110-19)

Micro Focus Education is pleased to announce the release of its newest offering within the ADM portfolio.

·         Developed with Micro Focus ART
·         Self-paced, with tracking and graded exam
·         Interactive software simulations with voice-over and closed captioning
·         Online resources include exercise scenarios, based on a full workflow from requirements through to reporting




Course outline

This introductory course provides students with the skills needed to effectively use the Silk Central software product. This course focuses on teaching the concepts and technologies in Silk Central and the tasks performed by end users. This course covers the following areas: Requirements, Tests, Execution Management, Execution Plans, Test Tracking, Issue Management, Reports, Manual Testing, Projects.

For more information go to the Course Outline.

Access the course itself on Education Central.


Coming Soon: Silk Central advanced configuration and administration

 

Course Description

To provide advanced and administrative concepts and skills for Silk Central.

 

Audience/Job Roles

This course is for those persons who will administer, configure, and use Silk Central.

 

Course Objectives

This course provides the concepts and skills needed to:

 

·        Perform and understand Instance Administration, System Administration and Project Administration

·         Understand the access management concepts at a project and cross project level

·         Perform project setup and configuration

·         Create and understand baselines, custom attributes and filters

·         Understand how Issue Management workflows can be configured

·         Provide an overview of the different third-party integration categories available

·         Perform an integration with Git and Jenkins

·         Describe the various reporting mechanisms available

·         Describe report templates and subscriptions, and configure the Dashboard panel

 

 

 

Features

·         Developed with Micro Focus ART

·         Self-paced, with tracking and graded exam

·         Interactive software simulations with voice-over and closed captioning

·         Online resources include exercise scenarios, based on a full workflow from requirements through to reporting


Submit a New Item from Chat in SBM 11.6

In SBM 11.6, users can submit work items from within a chat session. 

An administrator must set up and configure ChatOps with SBM in order to use the Chat feature in SBM.  Note that we simplified the process by including ChatOps with SBM in 11.6.  This means an admin can now configure ChatOps on a Windows server running SBM, start the ChatOps services using SBM Configurator, and then finish the setup using Application Administrator. 

Once ChatOps is configured, users can click Open Chat from the Actions menu on a work item. 

Here’s how you submit an item from within a chat session:

Type either:

@sbm-bot submit item

or:

@sbm-bot submit item into projectName

If you decide to provide the project name, you can enter the full name or just a part of the name.  The chat bot will prompt you to select which project to use if there are several matches. 

For example:

After you select the project, you are prompted to complete any required fields.  For selection fields, up to 30 selections are displayed.  If you don’t see the selection you want, try typing it instead. 

After you finish providing the required values, click Submit to confirm:

It’s that easy.  Now you can chat with team members about a work item and submit a new one without even leaving the chat session.  

 


SBM MODSCRIPT, PART 18 - USING RANGE FOR EASY LIST INTERACTION

ChaiScript uses the "Range" object for iterating lists, Vectors, and Maps. Mostly it does this behind the scenes when you use the for ( rec : list ) syntax. However, a Range can also be a very useful way to quickly grab the first or last item in a list:

var firstField = range( rec.Fields() ).front();
var lastField = range( rec.Fields() ).back();

The above code creates a Range which wraps the list returned by rec.Fields().  The range can be used for iterating the list if desired, but the helpful hint here is how to access the first or last item in the list. 
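If you need more than the endpoints, the same Range can drive an explicit loop. Here is a minimal sketch, assuming the standard ChaiScript range methods empty(), front(), and pop_front(); the GetName() accessor is used purely for illustration:

var fields = range( Shell.Item().Fields() );
while ( !fields.empty() ) {
  // front() returns the current field; GetName() is just an example accessor
  Ext.WriteStream( fields.front().GetName() );
  fields.pop_front(); // advance to the next field
}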

 

SBM ModScript - Table of Contents


SBM MODSCRIPT, PART 17 - File Fields

File and URL fields were added in a recent version of SBM. Each such field can store one or more entries, either files or URLs. The ModScript interface has not yet been updated to make interaction with these fields simple. However, reading from the fields is not all that difficult, as ModScript has access to the TS_FILE_OBJS table, where all the data is stored. Accessing file contents can be more difficult, as the files can either be on the file system or stored as a blob (this is a system configuration option). Also, interacting with files isn't always easy anyway, as files can contain binary data, which makes reading or modifying the file a questionable use case. However, I was recently asked how a customer could access file data from a file field where all the files contain JSON data. This makes sense: we can do something at the script level with JSON data, so I went down the rabbit hole and came up with the following two options. Keep in mind that if you only require the file names and sizes, there is no need to get the file contents; simply read the rows from TS_FILE_OBJS and decide what to do with the values.

 

Option 1: Direct access from ModScript

ModScript is fairly powerful, so we should be able to simply grab the files, consume their contents, and process the JSON at will. For the most part, we can. The big blocker I ran into is that file attachments, when stored on the file system, can be on a network share and require a certain authenticated user to access them. If you are in this configuration, I'd suggest moving on to Option 2, as I did not find any workaround for network user impersonation in ModScript. This doesn't have to be a network folder, either; it could be a local folder that requires specific user permissions. In either case, Option 2 is for you. Option 1 works well with files stored as blobs in the database or files stored with no user authentication requirements.

For direct access, ModScript can read the rows from the TS_FILE_OBJS table. It then iterates the entries, and reads the file contents from the related blob or file. It parses the file contents into a JSON object. At that point, it is ready for whatever JSON processing you wish to do. I simply invoked processJSONObj() from part 14 of this series as an example. 

include("processJSONObj");

def AreAttachmentsStoredInDB() {
  global __AreAttachmentsStoredInDB__; // singleton, init once, access only through this function
  if ( __AreAttachmentsStoredInDB__.is_var_undef() ) {
    var setting = Ext.CreateAppRecord( Ext.TableId("TS_SYSTEMSETTINGS") );
    setting.Read("StoreAttachmentsInDatabase");
    __AreAttachmentsStoredInDB__ = 0 != setting.GetFieldValueInt("LONGVALUE");
  }
  return __AreAttachmentsStoredInDB__;
}

def AttachmentsFileSystemLocation() {
  global __AttachmentsFileSystemLocation__; // singleton, init once, access only through this function
  if ( __AttachmentsFileSystemLocation__.is_var_undef() ) {
    var setting = Ext.CreateAppRecord( Ext.TableId("TS_SYSTEMSETTINGS") );
    setting.Read("WorkCenterAttachDir");
    __AttachmentsFileSystemLocation__ = setting.GetFieldValueString("STRINGVALUE");
  }
  return __AttachmentsFileSystemLocation__;
}

class FileFieldEntry {
  var name;
  var filename;
  var fileSystemName;
  var blobID;
  
  def FileFieldEntry() {
    this.name = "";
    this.filename = "";
    this.fileSystemName = "";
    this.blobID = 0;
  }
}

def GetFileFieldEntries( item, fieldName, outVect ) {
  var fileObjs = Ext.CreateAppRecordList( Ext.TableId("TS_FILE_OBJS") );
  var field = item.Fields().FindField( fieldName );
  fileObjs.ReadWithWhere( "TS_RECORDID=? and TS_TABLEID=? and TS_FIELDID=?",
                          [
                            Pair( DBTypeConstants.INTEGER, item.GetId() ),
                            Pair( DBTypeConstants.INTEGER, item.GetRecTableId() ),
                            Pair( DBTypeConstants.INTEGER, field.GetId() )
                          ] );
  for ( fileObj : fileObjs ) {
    var entry = FileFieldEntry();
    entry.name = fileObj.GetFieldValueString("NAME");
    entry.filename = fileObj.GetFieldValueString("FILENAME");
    entry.fileSystemName = fileObj.GetFieldValueString("CONTENTS");
    entry.blobID = fileObj.GetFieldValueInt64("BLOBID");
    outVect.push_back( entry );
  }
}

def GetFileContents( FileFieldEntry entry ) {
  if ( AreAttachmentsStoredInDB() ) {
    var f = TempFile();
    Shell.Db().WriteBlobToFile( entry.blobID, f.GetFileName() );
    return Ext.ReadTextFile( f.GetFileName() );
  }
  else {
    var path = AttachmentsFileSystemLocation();
    path += '\\';
    path += entry.fileSystemName;
    return Ext.ReadTextFile( path );
  }
}

var fileFieldEntries = [];
GetFileFieldEntries( Shell.Item(), "JSON_FILES", fileFieldEntries );
for ( fileFieldEntry : fileFieldEntries ) {
  var fileContents = GetFileContents( fileFieldEntry );
  Ext.WriteStream( fileContents );
  var json = fileContents.from_json();
  processJSONObj( json );
}

 

Option 2: Access file/url field values via REST call to JSONAPI

In many ways, this option is far simpler than Option 1, as you don't have to do the work yourself. All we need to do is take advantage of the JSONAPI "GetFileField" function, which will provide the full file contents for our field. The one part that made this hard is that the JSONAPI uses BASE64 encoding on the file contents, and I needed some way to decode the contents as text so that I could work with them. This is why I wrote Part 16, where I provided a way to decode BASE64 values to text. First, you will need to add a RESTDataSource to your process app. Call it "SBM_JSONAPI" and point it to "http://localhost/workcenter/tmtrack.dll?JSONPage&command=jsonapi&JSON_Func=". It may be helpful to create an Endpoint for this, which allows you to customize the URL in AR for various environments. When ModScript invokes this URL, it needs to be able to get into SBM AE, so the URL may need to be HTTPS and may need to point to a specific AE runtime server or load balancer. The point here is that ModScript is invoking SBM AE's REST JSONAPI, so the URL has to help ModScript get there. The authentication type "Security Token" will probably work for handling auth, but this may also need to be fiddled with.

Once you have a RESTDataSource that will be used by ModScript, we can proceed:

include("processJSONObj");

def GetFileFieldEntries( item, fieldName ) {
  var REST = Ext.CreateAppRecord( Ext.TableId( "TS_RESTDATASOURCE" ) );
  REST.Read("SBM_JSONAPI");
  var out = "";
  if ( !REST.Get( out, [ 
                         Pair( "JSON_Func", "GetFileField" ),
                         Pair( "JSON_P1", item.GetRecTableId() ),
                         Pair( "JSON_P2", item.GetId() ),
                         Pair( "JSON_P3", fieldName )
                       ] ) ) {
    Ext.WriteStream( Shell.GetLastErrorMessage() );
    ExitScript();
  }
  return out;
}

add_global_const("ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/", "CONST_BASE64TABLE");
def Base64DecodeAsText( string input ) { // assumes output is valid text (not binary)
  var sOut = "";
   
  var buf = [uint8_t(),uint8_t(), uint8_t(), uint8_t()];
  var encoded = int( input.size() );
  var count = 3 * ( encoded / 4 );
  var i = 0;
  var j = 0;
  
  while ( sOut.size() < count ) {
    // Get the next group of four characters
    //    'xx==' decodes to  8 bits
    //    'xxx=' decodes to 16 bits
    //    'xxxx' decodes to 24 bits
    for_each( buf, fun( entry ){ entry = 0; } ); // zero out buffer
    var stop = min( encoded - i + 1, 4 );
    for ( j = 0; j < stop; ++j ) {
      if ( input[i] == '=' ) {
        // '=' indicates less than 24 bits
        buf[j] = 0;
        --j;
        break;
      }

      // find the index_of inside CONST_BASE64TABLE for our value
      buf[j] = fun( s, c ) {
        for ( var i = 0; i < s.size(); ++i ) {
          if ( s[i] == c ) {
            return i;
          }
        }
        return string_npos;
      }( CONST_BASE64TABLE, input[i] );
      ++i;
    }
	
    // Assign value to output buffer
    sOut += char(buf[0] << 2 | buf[1] >> 4);
    if ( sOut.size() == count || j == 1 ) {
      break;
    }
    
    sOut += char(buf[1] << 4 | buf[2] >> 2);
    if ( sOut.size() == count || j == 2 ) {
      break;
    }
	
    sOut += char(buf[2] << 6 | buf[3]);
  }
  
  return sOut;
}

var fileFieldValue = GetFileFieldEntries( Shell.Item(), "JSON_FILES" );
var fileFieldValueJSON = fileFieldValue.from_json()["fieldFileObj"]["fileObjList"];
for ( fileFieldEntry : fileFieldValueJSON ) {
  var fileContents = Base64DecodeAsText( fileFieldEntry["contentsBase64"]["data"] );
  var json = fileContents.from_json();
  processJSONObj( json );
}

 

SBM ModScript - Table of Contents


SBM MODSCRIPT, PART 16 - BASE64DECODE

I recently discovered that I would need a way to do base64 decoding for a ModScript I was writing. This can be tricky, as the output could be a binary value with embedded zeros. You could certainly do this with the output as a Vector with each entry a uint8_t (unsigned byte). However, in my use case, I knew that the data was text and could be represented as a string. As such, I wrote the following:

add_global_const("ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/", "CONST_BASE64TABLE");
def Base64DecodeAsText( string input ) { // assumes output is valid text (not binary)   
  var sOut = "";
   
  var buf = [uint8_t(),uint8_t(), uint8_t(), uint8_t()];
  var encoded = int( input.size() );
  var count = 3 * ( encoded / 4 );
  var i = 0;
  var j = 0;
  
  while ( sOut.size() < count ) {
    // Get the next group of four characters
    //    'xx==' decodes to  8 bits
    //    'xxx=' decodes to 16 bits
    //    'xxxx' decodes to 24 bits
    for_each( buf, fun( entry ){ entry = 0; } ); // zero out buffer
    var stop = min( encoded - i + 1, 4 );
    for ( j = 0; j < stop; ++j ) {
      if ( input[i] == '=' ) {
        // '=' indicates less than 24 bits
        buf[j] = 0;
        --j;
        break;
      }

      // find the index_of inside CONST_BASE64TABLE for our value
      buf[j] = fun( s, c ) {
        for ( var i = 0; i < s.size(); ++i ) {
          if ( s[i] == c ) {
	        return i;
	      }
        }
        return string_npos;
      }( CONST_BASE64TABLE, input[i] );
      ++i;
    }
	
    // Assign value to output buffer
    sOut += char(buf[0] << 2 | buf[1] >> 4);
    if ( sOut.size() == count || j == 1 ) {
      break;
    }
    
    sOut += char(buf[1] << 4 | buf[2] >> 2);
    if ( sOut.size() == count || j == 2 ) {
      break;
    }
	
    sOut += char(buf[2] << 6 | buf[3]);
  }
  
  return sOut;
}

The function above iterates the input string contents and uses base64 decoding to create a decoded output string. Notice that the "buf" variable is a Vector of 4 unsigned 8-bit integers. As we are going to use bit shifting to decode the data, it is important to use unsigned byte data to ensure the expected bit-shift results. We find the index of each character in CONST_BASE64TABLE to find the data representation we are looking for, then use bit shifting to convert the buf values back to text. The result is the original text after processing the base64 algorithm. A possible use case for this might be decoding HTTP headers from a REST call. 
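As a quick usage sketch, the hard-coded string below is simply an illustrative base64 value that decodes to plain text:

var decoded = Base64DecodeAsText( "SGVsbG8sIFNCTSE=" );
Ext.WriteStream( decoded ); // writes "Hello, SBM!"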

SBM ModScript - Table of Contents


SLAs with SBM Notification Engine

Starting in SBM 11.6, the Service Level Agreement (SLA) feature has been enhanced to use the standard SBM Notification Server engine instead of the SLA engine. This means that you can configure an SLA action using the familiar notification UI, with most of the notification options now available to you.

For example, let’s say you want to notify the current owner of an item when that item is at risk for violating an SLA. To set this up, assume that you’ve already created an SLA and any applicable clauses. Now, add the action for a clause using a When condition such as the following:

(When: 5 minutes before Medium risk)

After you’ve specified the When condition and saved the action, it’s time to create the notification. Click the add icon:

From here, you will notice the options are similar to creating a standard notification. Fill out the form as shown, and then click the + icon to create the condition: 

This is where you designate the current owner of the SLA item to receive a notification. Set the condition as shown:

Owner Is Equal To (Current User)

After saving the condition, you can return to the SLA action and review a summary:

Don’t forget to subscribe users to this new notification. They will now be notified when their items are at risk of violating an SLA.

 

NOTE: If you are upgrading and have existing SLAs, you can switch to the SBM Notification Server engine by removing the current SLA notification rules and then re-adding them. 

 


Custom SLA Reports in SBM 11.6

Starting in SBM 11.6, you can now create and run custom Service Level Agreement (SLA) reports based on Listing, Distribution, Advanced Distribution, and Summary reports.

Let’s check this out.

First, in Application Administrator, I need two new report privileges, Create/Edit/Delete SLA reports and Run SLA reports.

Now, assume that I have the SLA engine running, an SLA defined, and a few items submitted that are subject to the SLA. I’ll create a simple listing report.

Before I do anything else, I need to go to the Additional Options page and select Show SLA Fields. This enables SLA fields to appear in most places in the report definition where other regular fields appear (where applicable).

Now, on the Content page, you can see that the SLA fields appear in the Select Columns to Display list. I’ll select them all and move them over.

 

After that, on the Search Filter page, I’ll specify that only items that are subject to the Test SLA appear in my report:

And here’s my report:

Let’s create a distribution report now. As with the Listing report, I first select Show SLA Fields and then set my search filter to the Test SLA.

For this report, I want to show how many items are at risk for violating the Test SLA, sorted by type and output as a percentage. So, on the Content page, I set my options as shown below:

Here’s what my distribution report looks like:

Now I can easily see the types of items that are at risk for violating the Test SLA, the total number of items, and the distribution for each category.  


SBM 11.6 Work Center Search Improvements

Check out the Work Center search improvements we added in SBM 11.6.

Search Filters

You can now save facets that you have selected as a search filter. This lets you save and reuse your facet selections for another search.  To save a search filter, enter your search phrase, select your facets, and then click Apply selections:

Click Save As, and then enter a name for your search filter:

The next time you perform a search, just select your saved filter in the Refined By section instead of applying each individual facet again. 

Search in Sub-Projects

You can now select projects and sub-projects that you want to search in via My Project Items before you perform a search.  This lets you apply a project filter before you start searching for items.  

The selections you make in My Project Items only apply to work item searches.  You still use My Projects in your user profile to save your preferred projects for submitting items, creating feeds, and creating reports.

  

The project selections that you make are saved in your browser's storage.

New Search Facets

You can now use User, Multi-User, Single Selection, Multi-Selection, Single Relational, and Multi-Relational fields as facets on the Work Center search results page for a single application. 

To use these fields as facets, select the Appears on searches for this table option on the field in Composer, and then redeploy the process app.

Other Search Improvements

An X icon has been added to the search field so that you can easily clear a search phrase:

Work Center can now autocomplete a phrase based on your recent search history:

You can enable autocomplete in your Work Center Search settings.

You can use the new Top match option to set a hard limit on the total number of search results that are returned. You can also adjust the Items per page setting to limit the maximum number of search results that are displayed.

You can now show or hide the Refined By section on the Search page:

When you collapse the Refined By section, an icon displays the number of facets that you have applied:

To improve search page performance, you can collapse unused facets:

We hope you find these new features helpful.  Let us know how we can improve your search experience in Work Center!


SBM 11.6 Report Improvements

Using Query at Runtime in Advanced SQL

In SBM 11.6,  you can use a Query At Runtime condition in an advanced SQL statement using the following format: 

%QAR{{Table.Column, User text}}%

The Query At Runtime definition starts with the string %QAR{{ and ends with }}%. 

The portion in between consists of two parts:

  • Table.Column – This defines the database table and the column to use.  The table name is optional.  If you leave it out, the table you select in the Report Item Type field is used instead.  Your SQL statement will just be: %QAR{{Column, User text}}%.
  • User text – This is text that you add to the field name on the Query At Runtime report page.  

For example, if your statement is:

@WHERE

TS_DEVELOPER in (%QAR{{UBG_ISSUES.TS_DEVELOPER, is this person}}%) and

TS_OWNER in (%QAR{{UBG_ISSUES.TS_OWNER, is this person}}%)

The Query at Runtime report page will look like this:

Note that the following field types are not supported with advanced SQL Query At Runtime:

  • System fields
  • Multi-Relational, Multi-Selection, Multi-User, and Multi-Group fields

Here are some Query at Runtime examples with supported field types:

  • Last Modified Date field:

@WHERE

UIM_INCIDENTS.TS_LASTMODIFIEDDATE >

%QAR{{ UIM_INCIDENTS.TS_LASTMODIFIEDDATE, is after }}%

  • Owner field:

@WHERE

TS_OWNER in (%{{Current User}}%)

  • Text field:

@WHERE

TS_ISSUEID like '%%QAR{{TS_ISSUEID, contains}}%%'

@WHERE

TS_ISSUEID like '%%QAR{{TS_ISSUEID, contains}}%%' OR

TS_ISSUEID like '%%QAR{{TS_ISSUEID, contains}}%%' OR

TS_ISSUEID like '%%QAR{{TS_ISSUEID, contains}}%%'

  • Date/Time field:

@WHERE

TS_EST_DATE_TO_FIX >= %QAR{{UBG_ISSUES.EST_DATE_TO_FIX, >=}}%

@WHERE TS_EST_DATE_TO_FIX > %QAR{{UBG_ISSUES.TS_EST_DATE_TO_FIX, >}}% and

TS_EST_DATE_TO_FIX < %QAR{{UBG_ISSUES.TS_EST_DATE_TO_FIX, <}}%

  • Binary field:

@WHERE

TS_P4STATUS = %QAR{{UBG_ISSUES.P4STATUS, =}}%

  • User field:

@WHERE

TS_DEVELOPER in (%QAR{{UBG_ISSUES.TS_DEVELOPER, in}}%) and

TS_OWNER in (%QAR{{UBG_ISSUES.TS_OWNER, in}}%)

  • Numeric floating point field:

@WHERE

TS_FLD_NUM_2_FP > %QAR{{UBG_ISSUES.TS_FLD_NUM_2_FP, >}}%

  • Numeric integer field:

@WHERE

TS_FLD_NUM_1_INT > %QAR{{UBG_ISSUES.TS_FLD_NUM_1_INT, >}}%

  • Single-Relational field:

@WHERE

TS_FLD_SINGLE_RELATIONAL in (%QAR{{TS_FLD_SINGLE_RELATIONAL, in}}%)

Combining Basic Conditions and Advanced SQL

In SBM 11.6, you can combine basic and advanced SQL conditions in your report definition.  This takes some of the hassle out of writing the advanced SQL statement.  You can also add basic conditions that use Query at Runtime parameters and use them with advanced SQL conditions.

 

For example, to find active items owned by Joe that have a file attachment, define part of the search using basic conditions, and then use pass-through SQL for the remainder:

(In this example, 16 represents a file attachment.)

Other Report Improvements

Make sure to check out the following additional improvements in this release:

  • You can now sort rows in Distribution reports by count. You can order rows by name, total value, or maximum value, and then sort in ascending or descending order.
  • You can now choose to display percentages instead of the number of items in Distribution reports.
  • A new grid-based HTML template (gridlist.htm) is available for Listing reports. This template includes a static header section, which keeps the report title, column titles, and Actions drop-down list always visible. It also provides automatic pagination.

 

 


Parallel Development Using Dimensions CM

Streams

The recommended way of doing parallel development in Dimensions CM is to use streams. Streams represent branches of development: they contain files and folders and keep track of their revisions, history, and pedigree.

Different teams may use separate streams for parallel development. A stream may be created based on a mainline stream or project and merged to it later.

You can work with streams using the command line, the desktop and web clients, Windows Explorer, or any of the IDE integrations.

Topic Streams

Topic streams are private development branches that, by default, are visible only to the originator. With topic streams you can:

  • Isolate work from an existing public stream
  • Hide the changes until they are ready to be merged back into a public stream
  • Create a backup of your local changes in the repository when you need to switch to another task (this is called shelving)
  • Use pull requests to automatically integrate changes into the parent stream

Projects

Projects may also represent a branch of development. Unlike streams, they employ the exclusive lock model, where developers request write access to a file before making changes. Projects are better suited to managing non-software assets or large binary files, working in a regulated environment, or using remote replication.

Different people may need to work on the same files for different purposes. In this case, projects use version branches to separate concurrent revisions of the same file, and those revisions need to be consolidated (merged) on a file level.

Stream Relationships

  • Items. A stream contains a list of item revisions; it owns its items.
  • Requests. A request may be related to a stream, but it does not have to be. Requests may be listed in a stream's inbox or catalog view and used for delivering changes, and a stream may be configured to always require a request for delivering changes.
  • Other streams/projects. A stream may be based on another stream or a project, or be the parent of a child stream.
  • Baselines. Baselines are snapshots of a stream's state at a particular time. Baselines are created from a stream or may be used by a stream.

Parallel Development

Streams enable an interactive update-modify-deliver-merge process for developers:

  • Use Update to get content from the remote repository into a local work area
  • Make local changes, build, and test
  • Use Deliver to commit changes to the repository
    • Streams maintain a single line of descent for item revisions, which ensures that files in the repository don’t have revisions with conflicting content
    • Developers don’t obtain an exclusive lock on files, which allows them to work on the same files in parallel
  • Use Merge to reintegrate changes between development branches
    • Interactive merge lets you resolve any conflicting content
    • Pull requests automate merging from a topic stream to its parent stream

 

 

 


COMING SOON: Silk Central Essentials

Course Description

To provide foundational knowledge and skills for Silk Central.

 

Audience/Job Roles

This course is for those persons who will configure and use Silk Central.

 

Course Objectives

This course provides the concepts and skills needed to:

 

·         Understand Silk Central overview

·         Understand requirements and how to create them

·         Describe how to create and manage tests

·         Create and understand execution plans

·         Understand how manual tests can be created

·         Create and implement a manual test

·         Use Issue Manager and how it is used for creating and tracking test issues

·         Understand mechanisms for tracking and reporting

·         Perform basic system administration

 

 

 

 

Features

 

·         Developed with Micro Focus ART

·         Self-paced, with tracking and graded exam

·         Interactive software simulations with voice-over and closed captioning

·         Online resources include exercise scenarios, based on a full workflow from requirements through to reporting


What's new in Visual Studio Integration in 14.5

Topic Streams and Pull Requests

Topic Streams are great when you want to make isolated changes, experiment, and re-integrate them later into your mainline stream. They work together with Pull Requests – a type of review that lets you evaluate a set of changes and orchestrate their automatic integration into the target stream.

From Visual Studio integration you can create a topic stream with New > Topic Stream command:

When creating a new topic stream you can choose the option to reuse your existing solution work area by rehoming it. Rehome converts your existing work area and aligns it with the new stream; it is faster than fetching into a clean work area.

After the topic stream is created you can access its pull request from the Dimensions Explorer:

Using the pull request you can review and approve the changes made in a topic stream. Depending on Dimensions Pulse settings, an automatic merge to the target stream may happen.

Pull requests are also accessible from the Reviews panel:

View enhancements

The Baselines panel now has the same design as the Requests panel:

The Streams and Projects panel has also been updated; it now supports search and can show recent and favorite streams or projects:

 

Themes support for request properties:

Main and context menu commands have been reorganized for better usability, and the toolbar and context menu icons have been reviewed and updated:

 

 


How to view a ModScript log file without using remote desktop connection

ModScript, at least since version 11.3, includes a "Log" object with methods for sending text to a log file in the SBM "Application Engine\Log" directory.  To view these messages you need to start a "Remote Desktop Connection" (aka "Remote Console" or RDP) to the AE server and view the log file in the "Application Engine\Log" directory.  Previously, this kind of logging was done using the "Ext.LogInfoMsg" call, which sent the message to the AE server's Windows Application Event log.  To view those log messages, you also had to "remote" into the AE server and then use the Windows Event Viewer to view the Application Event log.  That level of access can make some clients or server administrators nervous, and in some environments the developer may not have any access to the SBM servers at all.

Fortunately, there's a way to create a log that doesn't require any remote access to the AE server, outside of IIS.

Some basic "one time" setup is necessary to allow this capability. Have someone with "administrator" privileges on the SBM AE server do the following steps:

  1. "remote" into the SBM AE server
  2. "CD" to IIS's "inetpub\wwwroot" directory.
  3. Create a subdirectory under "inetpub\wwwroot" for the ModScript log files.  In this example the name of the subdir is "AE-Logs".
  4. Use the Microsoft Sysinternals "junction" tool to create a directory junction from "inetpub\wwwroot\AE-Logs" that is targeted at a subdirectory (which the "junction" command will create) under the Application Engine's "Log" directory.  In this example the name of the subdir under "Application Engine\Log" is "inetpub_AE-logs".

junction  "C:\inetpub\wwwroot\AE-logs"   "F:\Program Files\Serena\SBM\Application Engine\Log\inetpub_AE-logs"

Now your ModScript can use the following calls to create a log file in that new subdirectory.  Make sure that the name of the log file you create has a file extension that IIS will serve, like ".txt" or ".htm".  On my system, IIS will not serve a file with the ".log" extension; that could probably be changed in the IIS settings.

var my_IIS_log = Log();
my_IIS_log.Open("inetpub_AE-logs/ModScript_log.txt");
my_IIS_log.SetWantTimeStamp(true);
my_IIS_log.Message( LogLevelConstants.AVERAGE, "0 param call to my_IIS_log.Message" );
my_IIS_log.Message( LogLevelConstants.AVERAGE, "3 param call to my_IIS_log.Message : Reporting Level={0} : File={1} : IsOpen={2}" , my_IIS_log.GetReportingLevel() , my_IIS_log.GetFileName() , my_IIS_log.IsOpen() );

 

View the log in a browser:

https://-MY-AE-SERVER-/AE-logs/ModScript_log.txt


What’s new in Visual Studio integration in 14.4

In the 14.4 release the Visual Studio integration has seen a lot of improvement. The focus of the changes was on improving usability and discoverability and on modernizing the look.

Redesigned Dimensions Explorer

We’ve made a significant investment in updating the look and usability of Dimensions Explorer, previously known as Serena Explorer. It’s a single place where you can see the current solution state and access views and operations. It eliminates ambiguity and clearly displays the current solution context.

Often used as a starting point for beginning work, Dimensions Explorer provides hints and informs you about next steps:

    

Themes Support

Themes are available in Visual Studio 2012 and newer, and Visual Studio integration views now react to theme changes on the fly.

Light theme

Blue theme

Dark theme

Integrated Peer Review Process

With Dimensions CM Pulse, the peer review process is very easy. It is a powerful tool for reviewing changes, making comments, viewing code annotations, and collaborating. It is accessible from the Visual Studio integration via the Reviews panel. You can switch between different display modes and work with reviews inside the IDE.

 

Reviews panel

Doing code review: