
Silverlight 4 + RIA Services - Ready for Business: Validating Data


To continue our series, let’s look at data validation in our business applications. Updating data is great, but when you enable data updates you often need to check the data to ensure it is valid.  RIA Services has a clean, prescriptive pattern for handling this.   First, let’s look at what you get for free.  The value entered for any field has to be valid for the range of that data type.  For example, you never need to write code to ensure someone didn’t type “forty-two” into a textbox bound to an int field. You also get nice looking and well behaved validation exposure in the UI. 


Note: if you are not seeing this, ensure that “ValidatesOnExceptions=True” is in the binding expression for each field.

Of course, that sort of validation only goes so far; in a real application you need some more extensive validation.  This validation absolutely has to run before your business logic, because you don’t know what client might be sending you the update. In addition, you want to run the validation on the client to give the user a really nice experience and to reduce the number of error-case hits to your server, which reduces server load.   In traditional models, you need to do error checking twice to cover both of these cases.  But that is obviously error prone and easy to get out of sync.  So RIA Services offers a common model for validation.  

The most common cases are covered by a set of custom attributes you apply to your model on the server.  These attributes are common across the .NET Framework, supported by ASP.NET Dynamic Data, ASP.NET MVC and RIA Services.  You can find the full set in System.ComponentModel.DataAnnotations.    But to give you a flavor:

[Display(Order = 0)]
[Required(ErrorMessage = "Please provide a name")]
public string Name { get; set; }

[Range(0, 999)]
public Nullable<decimal> Price { get; set; }

// The attribute on Uri was truncated in the original listing; RegularExpression
// is the standard DataAnnotations attribute here, and the pattern below is
// illustrative.
[RegularExpression(@"^https?://\S+$",
                    ErrorMessage = "Please use standard Url format")]
public string Uri { get; set; }



As you can see from above, the validations on the client are handled automatically, but they are also run again on the server BEFORE your Update method is called.  This allows you to keep the validation gunk out of your business logic for Update.   The other great thing is that these validations apply in exactly the same way no matter where that entity is used in the UI, because they are built into the model.    You can of course localize the error messages by passing a resource reference rather than a hard-coded string.  You can also read this validation metadata out of an external config file or database rather than using attributes in the code. 
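As a sketch, localizing the Required message above uses the resource-based properties on the attribute rather than ErrorMessage (the ValidationStrings class and NameRequired key are assumed names, not from the original post):

```csharp
// ValidationStrings is a hypothetical .resx-generated resource class;
// "NameRequired" names a string resource inside it.  DataAnnotations
// looks the message up at validation time, so it follows the UI culture.
[Required(ErrorMessageResourceType = typeof(ValidationStrings),
          ErrorMessageResourceName = "NameRequired")]
public string Name { get; set; }
```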

But clearly that doesn’t cover all cases.  Many times you need to write some actual procedural code.  Let’s consider an example of validating the description to ensure it is really complete.  There is no custom attribute for that ;-).  So let’s write a bit of C# code.  First we need to indicate that the Description field has some custom validation. 

// The attribute was missing from the original listing; CustomValidation wires
// the property to a static validation method (the class name is illustrative).
[CustomValidation(typeof(MyValidationRules), "IsDescriptionValid")]
public string Description { get; set; }

Then you can easily create this class and implement the IsDescriptionValid method.


  1:         public static ValidationResult IsDescriptionValid(string description)
  2:         {
  3:             if (description != null && description.Split().Length < 5)
  4:             {
  5:                 var vr = new ValidationResult("Valid descriptions must have 5 or more words.",
  6:                     new string[] { "Description" });
  7:                 return vr;
  8:             }
  9:             return ValidationResult.Success;
 10:         }


Notice in line 1 that the method must return a ValidationResult – this is a class from DataAnnotations that contains information about any validation errors.  The method also has to take one parameter of the same type as the field it is being applied to.  You could also apply the validation at the entity level to do cross-field validation.  
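An entity-level rule is wired up the same way, except the attribute goes on the class and the method receives the whole entity. A minimal sketch (the Plate entity and the rule below are illustrative, not from the original post):

```csharp
// Hypothetical entity-level rule: because the attribute sits on the class,
// the validation method receives the whole entity and can compare fields.
[CustomValidation(typeof(MyValidationRules), "IsPlateValid")]
public partial class Plate
{
    public string Name { get; set; }
    public string Description { get; set; }
}

public static class MyValidationRules
{
    public static ValidationResult IsPlateValid(Plate plate)
    {
        // Cross-field check: the description should say more than the name.
        if (plate.Description == plate.Name)
        {
            return new ValidationResult("Description should say more than the name.",
                new string[] { "Name", "Description" });
        }
        return ValidationResult.Success;
    }
}
```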

Next, on line 3, I am implementing a very lame algorithm for determining whether the description is valid. 

On lines 5 and 6, I return an error and indicate which field it applies to. 

Now run the application.  You will see that we can edit the description and tab off with no error, but if we submit, we get back an error in exactly the same way as we saw before.  Notice I could have sent several entities and each of them could have errors.  RIA Services keeps up with each of them (we even give you a list) and as the user edits each one we show some UI like this. 



Note, if you see this sort of dialog instead:


It likely means you need to write a handler for the SubmittedChanges event on your DomainDataSource. 


<riaControls:DomainDataSource  SubmittedChanges="plateDomainDataSource_SubmittedChanges"


        private void plateDomainDataSource_SubmittedChanges(object sender, SubmittedChangesEventArgs e)
        {
            if (e.HasError &&
                e.EntitiesInError.All(t => t.HasValidationErrors))
            {
                System.Windows.MessageBox.Show(e.Error.ToString(), "Submit Error",
                    System.Windows.MessageBoxButton.OK);
                // Prevent the default unhandled-error dialog from appearing as well.
                e.MarkErrorAsHandled();
            }
        }

Now, this is very cool because we have the full power of .NET to write our validation rules.  But the downside is that I only get validation once the change is submitted.  What I’d really like to do in some cases is write some custom validation logic and have it execute on the server AND the client.    Luckily we have the power of .NET on both the client and the server, so we can use shared-code validation.  To enable this, simply change the name of the file that contains the validation rule to include a “.shared.cs” suffix.  This change causes the RIA Services MSBuild logic to compile the file for both the client and the server. 
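The shared file then looks like any other validation class; only its name is special. A sketch (MyValidationRules.shared.cs is an assumed file and class name, matching the rule written earlier):

```csharp
// MyValidationRules.shared.cs
// The ".shared.cs" suffix tells the RIA Services build step to copy this file
// into the generated Silverlight client project, so the same rule compiles
// and runs on both sides.
public static class MyValidationRules
{
    public static ValidationResult IsDescriptionValid(string description)
    {
        if (description != null && description.Split().Length < 5)
        {
            return new ValidationResult("Valid descriptions must have 5 or more words.",
                new string[] { "Description" });
        }
        return ValidationResult.Success;
    }
}
```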



Now the exact same code will run on the client and the server.  So if there is a bad description, we no longer have to round-trip to the server to work that out. 


Of course, in a real-world case you are likely to have both server-only and shared validation rules, and RIA Services fully supports that scenario as well.  Simply define any shared validation rules in .shared.cs files and any server-only validation rules in another file. 
