A practical look at what’s involved in converting your Visual FoxPro (VFP) applications to Visual Studio and SQL Server.
Let me say up front that I am a long-time FoxPro developer and that I love VFP. I also love .NET and SQL Server and I’ve headed up and participated in many conversions. Most of the conversions I’ve worked on were not driven by technical necessity, but by customer demand that software be built with .NET and SQL Server. Whatever the reason, conversion from VFP to .NET is a significant undertaking.
If you are currently involved in a conversion effort or are planning a conversion effort, then this article is for you. If you’re new to .NET or SQL Server, this article can help you look at those technologies from a VFP perspective.
A solid plan for conversion consists of the following:
- Document your existing system.
- Understand the separate conversion functions required.
- Evaluate the best strategy for converting your application.
- Discover the level of effort required to tackle each function.
- Document the plan of attack.
- Implement.
In this article, I’ll focus on the most typical example of a Microsoft-centric conversion: SQL Server for the data store and a C# or Visual Basic code base. However, you can apply this process to conversions to other databases and languages. In addition, I’ll address specific techniques for converting VFP to .NET in the Implementation section so that you’ll know how to get started with the hands-on work.
Document Your Existing System
Before you begin any development work, you’ll document the existing application in order to provide metrics, a basis for your estimates, and a roadmap for the process. Documenting the existing application can range from a list of how many PRGs, SCXs, and FRXs you have to sophisticated metrics that attach weighted values to every part of your application. You can find a free tool written in VFP8 at VFPConversion.com (see sidebar, Tools for Conversion) that will scan a PJX (project) file and output some measurements for the project. This tool provides a good starting point for documenting an existing VFP application.
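Even without that tool, a first-cut file inventory is a one-page program. The following C# sketch (C# is assumed here only because it is a likely conversion target) tallies VFP source files by extension under a folder; it is not the VFPConversion.com tool, and the extension list is just a sample:

```csharp
using System;
using System.IO;
using System.Linq;

class ProjectMetrics
{
    // Count the files under a folder tree that match one extension.
    static int CountFiles(string root, string extension)
    {
        return Directory
            .EnumerateFiles(root, "*" + extension, SearchOption.AllDirectories)
            .Count();
    }

    static void Main(string[] args)
    {
        // Default to the current folder if no path is supplied.
        string root = args.Length > 0 ? args[0] : ".";

        // A sample of VFP source file extensions to tally.
        foreach (string ext in new[] { ".prg", ".scx", ".frx", ".mnx" })
        {
            Console.WriteLine("{0}: {1}", ext, CountFiles(root, ext));
        }
    }
}
```

Raw counts like these become the inputs you attach weighted values to when building a fuller assessment.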
Understanding where you are today will be a key step in a successful conversion effort. Assessment documents give the non-technical members of the team, including management, an understanding of the size and scope of the project. They also serve as a common non-technical description of the work to be done.
Understand the Separate Conversion Functions Required
The functions required to convert an application vary from project to project based on the goals you have for the conversion. At one end of the spectrum you’ll find simply converting an existing VFP application to use SQL Server instead of DBFs to store data. At the other end of the spectrum you’ll see a complete rewrite of an application to change its basic architecture, give it a face lift, and use the knowledge gained with the old application to create a completely new version. In this article, I’ll discuss the most typical conversion project: producing a .NET version of an existing VFP application with its current functionality.
The possible conversion functions follow:
- Upsize DBFs to SQL Server.
- Tune SQL Server.
- Convert an existing application to use a SQL Server back end.
- Convert visual aspect of forms.
- Convert data environments of forms.
- Convert form code.
- Convert visual aspect of reports.
- Convert data environments of reports.
- Convert expressions and code in reports.
- Convert method code and PRGs.
- Convert functionality provided by ActiveX controls and FLLs.
- Convert visual aspect of menus and toolbars.
- Convert menus and toolbar code.
- Handle special issues (unique programming challenges).
Evaluate the Best Strategy for Converting Your Application
If the existing application uses DBFs to store data, you must decide whether to modify it to work with a SQL Server back end prior to converting any code to .NET. This approach works well when the support staff is not already familiar with SQL Server and when there is sufficient time in the timeline, because it allows the staff to become familiar with administering SQL Server and allows time for fine tuning the database prior to switching to an entirely new code base. In multi-application conversions, having the database converted and functional up front allows for an incremental conversion. Unfortunately, when you take this approach you’ll do some work on the existing code base and you’ll have to toss that work aside when moving to the new code base.
You can convert systems that are currently built as several individual applications or modules accessing a SQL Server back end in an incremental fashion, one application or module at a time. If the existing applications aren’t already divided neatly into functional areas, do that now so that you can turn on functionality in the new application and turn off functionality in the old application in stages, giving you a smooth transition path to the new application.
Much has been written about using interop between VFP and .NET in a conversion, so I won’t try to cover all of that here. Essentially, if you’ve got VFP COM objects or VFP Web services, .NET can easily consume them. Likewise, VFP can consume .NET Web services and DLLs exposed as COM objects. In a new twist, Microsoft recently released the Interop Forms Toolkit 1.0 as a power pack for Visual Basic 2005, which makes it easy to expose .NET forms written in Visual Basic as COM objects. While Microsoft developed this toolkit to allow Visual Basic 6.0 applications to run .NET forms, it works with any environment that can use COM, including VFP. I’ve tried the toolkit and have successfully run .NET forms within my VFP application. With a little work, you can even save VFP cursors as XML, pass the XML to a .NET form, and reconstitute it as a .NET DataTable. You can also use your DBFs in your .NET applications, especially with the new DDEX provider shipping as part of Sedna.
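As a concrete sketch of the cursor-to-DataTable trick, the C# below reconstitutes element-style XML (the kind VFP’s CURSORTOXML can emit) into an ADO.NET DataTable. The table and column names are invented for the example:

```csharp
using System;
using System.Data;
using System.IO;

class CursorXmlDemo
{
    static void Main()
    {
        // Stand-in for XML produced on the VFP side; the "customer"
        // table and its columns are hypothetical.
        string xml =
            "<VFPData>" +
            "<customer><id>1</id><name>Acme</name></customer>" +
            "<customer><id>2</id><name>Globex</name></customer>" +
            "</VFPData>";

        // ReadXml infers a table from the repeated element and a
        // column from each simple child element.
        var ds = new DataSet();
        ds.ReadXml(new StringReader(xml));
        DataTable customers = ds.Tables["customer"];

        Console.WriteLine(customers.Rows.Count);       // 2
        Console.WriteLine(customers.Rows[0]["name"]);  // Acme
    }
}
```

In a real interop scenario the XML string would arrive as a parameter on the COM-visible .NET form rather than being hard-coded.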
Initially, you’ll do a little work on each of the areas needing conversion so that you can discover the level of effort required for the conversion, as described in the next section. However, after the initial phase, you will divide the work into the classifications described earlier in “Understand the Separate Conversion Functions Required.” The most successful order for performing the conversion has historically been to convert the main program and window for the application, and then convert the visual aspects such as forms, menus, and reports. This gives you a complete skeleton of the application, though not a fully functional application. Once you have a skeleton in place, convert the data aspects of the individual elements, followed by the method code, resolving special issues, testing, and Quality Control (QC). In some instances, developers will have more success converting all aspects of each form, report, and menu before moving on to the next. In other cases, specialization of skills makes the process flow better when you carry out each conversion process separately from the others. Making a good decision for your situation depends on knowing the strengths of your team.
Discover the Level of Effort Required to Tackle Each Conversion Function
In order to accurately determine the level of effort required in a conversion project, you must complete samples of each type of conversion task and measure the time it takes to complete them. Just like making pancakes, you will generally throw out the very first effort as non-representative since it will involve a lot of discovery. The very first form you convert will probably take a fairly long time if you’ve never done it before. Measuring the time it takes to convert the second and third forms, however, can give you valuable information about how long it will take to convert the next 200 forms. Even though developers will get better at conversion tasks and will convert the 200th form much more quickly than the 4th form, they will also run into what I call “special issues.” Special issues are problems unique to one or two of your forms. On average, the increase in efficiency you achieve from repetition will be offset by special issues that must be tackled, so these early measurements will turn out to be more accurate than you might suspect.
Document the Plan of Attack
Armed with metrics about the size and complexity of your applications, a strategy for doing the conversion, and estimates that give you a level of effort expected, you’ll be ready for the next steps: document the plan, develop timelines, do resource planning, and create guidelines for implementation. The larger the application and the conversion team, the more valuable a documented strategy becomes. But even on small projects, it’s important that all team members, including management, understand what to expect.
Implement
Database Upsizing to SQL Server
You’ll find two common paths to take in upsizing DBFs (with or without DBCs) to a SQL Server database. The first is to use the VFP SQL Server Upsizing Wizard in a one-time process. The wizard has proven effective for the initial conversion of small to medium-sized databases. Once the structure of the database resides in SQL Server, you will tweak and maintain it there. The data will very likely be imported into the new SQL Server database structure many times after the initial upsizing, either via the “import” function of SQL Server Management Studio or through the second method of upsizing: custom SQL Server Integration Services packages (formerly known as DTS packages).
In complex upsizing scenarios you can use SQL Server Integration Services (SSIS) to map the VFP data into an entirely new SQL Server database structure and it can perform complex conversions on the data during the import. I want to mention for those not targeting SQL Server as their new back-end data store, that SSIS does not require either the source or the target for the data migration package to be SQL Server. You can use SSIS, for example, as an effective tool to upsize VFP or Microsoft Access databases to Oracle or Informix-or even the other way around!
Along with any transformations to the new database, you will normally practice the importing of data several times to ensure the process is bulletproof when it comes time to convert the live database. You’ll also find the importing process valuable for testing and QC purposes. VFP reports run against a certain data set should match exactly those run against the SQL Server version of the same data set. To accomplish this, testers often have their own copies of the VFP and SQL Server versions of specific data sets.
More often than you think, you will have to tweak the SQL Server database structure as the conversion process progresses. After you make these changes, you can re-run the import routines to create clean copies of the development and test databases and to verify that the import process is still valid.
When working with large databases, you will create and use smaller subsets of the production data to keep development and testing manageable, and you will do full-sized conversions to accommodate performance testing.
VFP Application Conversion for SQL Back End
VFP supports two ways to talk to SQL Server: SQL pass-through and remote views. SQL pass-through is both faster and more flexible, but it’s also harder to convert in an automated fashion and it doesn’t support binary data. Still, in most cases SQL pass-through will be your go-to technology for the majority of your application.
SQL pass-through does not, by default, support pushing updates made to the local cursor back to the server. Fortunately, you can enable updates back to the server in code. This approach emulates VFP’s native ability to support inline SQL syntax in your code. Instead of executing the SQL directly as a command, you’ll make a function call with the SQL statement, and you’ll use parameters to combat SQL injection attacks. On a typical editing form, you might use SQL pass-through to retrieve a cursor that you will bind to a combo box, another to calculate some values for display, and another to retrieve the record that you’ll be editing. Only the last cursor needs to be updatable, so you’ll set that cursor’s update properties (with CURSORSETPROP()) so that a simple TABLEUPDATE() will push the changes back to SQL Server.
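On the .NET side, the same parameterized approach maps naturally onto an ADO.NET command. This sketch only builds the command; the table and column names are hypothetical, and executing it would require an open SqlConnection, which is omitted here:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class ParameterizedQueryDemo
{
    static void Main()
    {
        // Parameter placeholders keep user input out of the SQL text,
        // which is the .NET counterpart to parameterized SQLEXEC().
        var cmd = new SqlCommand(
            "SELECT * FROM customers WHERE cust_id = @custId");
        cmd.Parameters.Add("@custId", SqlDbType.Int).Value = 42;

        // With a live connection you would continue with:
        //   cmd.Connection = connection;
        //   SqlDataReader reader = cmd.ExecuteReader();
        Console.WriteLine(cmd.Parameters.Count);  // 1
    }
}
```

The parameter value never gets concatenated into the command text, so a malicious string cannot change the shape of the query.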
When the table you’re working with contains binary data, remote views are your only choice. When used in a data environment, remote views are also relatively easy to convert automatically to .NET. But remote views are relatively slow, they are statically defined, and they must reside in a DBC. Though you can create remote views programmatically, the fact that they must reside in a DBC means that excessive creation and destruction of remote view definitions will bloat the database’s memo file and further decrease performance.
Database Tuning
VFP programmers have a tremendous amount of knowledge of SQL databases. However, SQL Server is not exactly like VFP, and there is a learning curve associated with it. Some of the main issues you’ll encounter in a conversion are the differences in column types and the differences in indexes. For the most part, SQL Server has many more column type choices than VFP. For instance, if you’re storing the value for a 3-option radio button in a DBF, you can choose either Numeric(1,0) or Integer. In SQL Server, you’ll want to choose a TinyInt (a numeric type that holds values from 0 to 255). The one notable exception to SQL Server having more choices of data types is VFP’s Date type, which has no direct equivalent in SQL Server. SQL Server supports DateTime and SmallDateTime, but you must always store the time portion, even if you don’t want to use it. Stranger still, SQL Server has no convenient way to strip the time portion from a DateTime value so that it can easily be used as a date. Luckily, VFP handles that by mapping SQL DateTime columns to VFP Date columns, but when coding for .NET you’ll have to accommodate this difference.
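For example, one common T-SQL idiom (dating from before SQL Server gained a DATE type) strips the time portion by round-tripping through a whole-day count:

```sql
-- DATEDIFF counts whole days between the base date (day 0) and now;
-- DATEADD adds that many days back to day 0, yielding midnight of
-- the same day -- effectively a date-only value.
SELECT DATEADD(dd, DATEDIFF(dd, 0, GETDATE()), 0) AS DateOnly;
```

Converted .NET code typically buries a trick like this in a helper function or a computed column rather than repeating it in every query.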
Indexes in SQL Server bear little resemblance to those in VFP, though they have the same intent. SQL Server indexes cannot be based on expressions, only on columns. This isn’t as bad as it sounds, because the most common expressions used in VFP indexes are UPPER() and DELETED(): by default, SQL Server’s use of indexes is case-insensitive, and there is no concept of a deleted record. Considering that you never access an index directly in SQL Server (in fact, you can’t), it’s actually a much easier system to work with. In SQL Server, indexes are made up of one or more columns, and indexes can be primary keys, unique (candidate in VFP-speak), or regular indexes.
I can’t even begin to tell you all you need to know about tuning indexes in SQL Server, but covered indexes are a good place to start. Covered indexes contain all of the columns used in a query. They’re much faster than non-covered indexes because instead of using the index to determine which records are qualified, looking up the records and returning the data, SQL Server can return all of the data straight from the index, without ever looking up the underlying records.
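As an illustration, assuming a hypothetical orders table, the INCLUDE clause (available since SQL Server 2005) is one way to build a covered index for a query that filters on cust_id but also returns order_date and total:

```sql
-- The index is keyed on cust_id; order_date and total are stored at
-- the leaf level, so a query touching only these three columns is
-- answered entirely from the index -- no base-table lookup needed.
CREATE NONCLUSTERED INDEX IX_orders_cust_covered
    ON orders (cust_id)
    INCLUDE (order_date, total);
```

INCLUDE keeps the extra columns out of the index key itself, so they add storage but not key width or sort cost.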
You should also know about clustered indexes. By default, the SQL Server Management Studio UI makes any primary key that you create with it a clustered index. In many (if not most) cases, this is not a good idea. A clustered index isn’t really an index at all. What it really does is specify that as records are added to a table or modified, they are to be physically sorted. Since you’re specifying a physical sort, you can have a maximum of one clustered index per table (you can’t physically sort the same table two ways at once). Going back to the discussion about covered indexes and how SQL Server no longer has to look up the base records in order to return data for a query, think of clustered indexes as an improved version of covered indexes. Since the records are physically sorted, you have access to every single column in a selected record without doing a second lookup. In addition, the records are physically located next to one another on disk, so operations working on consecutive records are very fast. Clustered indexes are very powerful if used correctly, but horrible if used incorrectly. Imagine setting a primary key to a Uniqueidentifier (GUID) column and making it a clustered index: because GUIDs arrive in no particular order, nearly every insert lands in the middle of the physical sort, splitting pages and fragmenting the table. When in doubt, do NOT use a clustered index unless you are absolutely sure about it.
A good rule of thumb for those new to SQL Server indexing is to begin by adding a non-clustered primary key to every table. As performance issues arise, add new indexes judiciously to alleviate the issues.
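Following that rule of thumb, a table definition might declare the primary key NONCLUSTERED explicitly, overriding the Management Studio default (the table itself is hypothetical):

```sql
-- The PRIMARY KEY constraint enforces uniqueness via a nonclustered
-- index; no physical sort order is imposed on the table.
CREATE TABLE customers
(
    cust_id INT         NOT NULL,
    name    VARCHAR(50) NOT NULL,
    CONSTRAINT PK_customers PRIMARY KEY NONCLUSTERED (cust_id)
);
```

You can always add a clustered index later, once you know which access pattern deserves the table’s one physical sort order.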
Visual Conversion
While often thought of as the “easy part” of the conversion because it doesn’t involve coding and algorithms, visual conversion from VFP to .NET often takes more time than any other task in a conversion project and usually comprises a large portion of the project’s cost.
If your goals in the conversion are to change the look and feel or the core architecture and workflow of the application, you’ll be re-creating your forms from scratch. You may choose to do the same with reports. However, in the majority of cases tools can do the visual conversion of forms and reports in a matter of minutes. Then you can plug converted forms and reports into the .NET project to create a non-functioning skeleton. Like the framing of a house, you get a great feeling of satisfaction because a great deal of change is readily apparent with comparatively little effort. With the skeleton of the forms and reports in place, you can add functionality to the application in an incremental fashion as you make the forms and reports functional, one by one.
Data Environment Conversion
One of VFP’s key strengths is that data manipulation is baked right into the language. Unfortunately, this removes some of the structure imposed on other languages and makes an automated approach to data environment conversion problematic. If your application uses private data sessions in conjunction with local or remote views in all of its forms and reports, it is likely that you’ll be successful in converting your data environments automatically using a tool or a custom utility. However, if your data environments are built on the fly using the language in various places throughout your forms, you will have a more manual process on your hands.
.NET treats data in an object-oriented fashion. The languages do not have the concept of a data environment the way that VFP does. Instead, forms hold references to DataSets and/or DataTables. A DataSet is somewhat analogous to a data environment in VFP because it contains tables, but there are significant differences. For instance, you can easily have more than one DataSet in use by a .NET form, while it’s not easy to use multiple private data sessions in a single VFP form.
.NET also has a MUCH more limited ability to manipulate data than VFP. The syntax is cumbersome, relying on DataSet and DataTable objects (which are part of ADO.NET) to handle data manipulation. Code such as the following in C# shows you how ugly it can be:
int someValue = (int)myDataSet.Tables["MyTable"]
    .Rows[currentRow]["SomeColumn"];
Many good .NET developers create classes just to interact with the data, a process called Object Relational Mapping (ORM), allowing them to work with strongly typed properties of a class and have the class deal with the details of the DataTables and DataSets. The strongly typed DataSets in .NET are simple examples of this strategy for accessing data and most .NET frameworks and solution platforms include some implementation as well.
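A minimal hand-rolled version of that idea looks like the following C# sketch: callers get typed Id and Name properties instead of string-indexed rows. The customer table and its columns are invented for the example:

```csharp
using System;
using System.Data;

// Wraps one DataRow so callers never see string indexers or casts.
class CustomerRow
{
    private readonly DataRow _row;
    public CustomerRow(DataRow row) { _row = row; }

    public int Id
    {
        get { return (int)_row["id"]; }
        set { _row["id"] = value; }
    }

    public string Name
    {
        get { return (string)_row["name"]; }
        set { _row["name"] = value; }
    }
}

class WrapperDemo
{
    static void Main()
    {
        // Build an in-memory table standing in for real data access.
        var table = new DataTable("customers");
        table.Columns.Add("id", typeof(int));
        table.Columns.Add("name", typeof(string));
        table.Rows.Add(7, "Acme");

        var customer = new CustomerRow(table.Rows[0]);
        Console.WriteLine(customer.Id);    // 7
        Console.WriteLine(customer.Name);  // Acme
    }
}
```

Misspell a column name in the wrapper and you fix it in one place; misspell it in scattered string indexers and you find out at runtime, form by form.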
Some of this is set to change in the next version of Visual Studio, when both the Visual Basic and C# compilers will support Language Integrated Query (LINQ). You’ll find LINQ even more powerful than the data manipulation features in VFP, but it will still be a version 1.0 product, so it’s likely to be lacking in several areas as well. Still, in the coming years LINQ may blow the lid off of what VFP developers have always enjoyed over their data-challenged counterparts.
Code Conversion
You’ll quickly find that code conversion is one of the most difficult parts of the conversion process. It requires the most skill in both VFP and .NET. You should think of this process more as a translation process than a conversion process-like translating a book from one language to another. It’s not a matter of translating each word; it’s a matter of translating the meaning.
To some extent you can automate the conversion of the control structure of the code. Just about every language has a counterpart for a control construct in every other language-or a way to emulate it. For instance, a DO CASE in VFP translates very closely to if () … else if () in C#, but only in some cases does it map to the switch() statement, which at first glance looks like the most similar construct.
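A small C# sketch makes the point; the quantity/discount logic is invented, and the original VFP DO CASE is shown in a comment:

```csharp
using System;

class DoCaseDemo
{
    static void Main()
    {
        // VFP's DO CASE evaluates arbitrary, possibly unrelated
        // conditions in order, which maps to if/else if -- not to
        // switch, which tests one expression against constants.
        //
        //   DO CASE
        //   CASE nQty > 100
        //       cBand = "bulk"
        //   CASE nQty > 10 AND lPreferred
        //       cBand = "discount"
        //   OTHERWISE
        //       cBand = "retail"
        //   ENDCASE
        int qty = 25;
        bool preferred = true;
        string band;

        if (qty > 100)
            band = "bulk";
        else if (qty > 10 && preferred)
            band = "discount";
        else
            band = "retail";

        Console.WriteLine(band);  // discount
    }
}
```

The second CASE mixes a comparison with a logical flag, which is exactly the kind of condition switch() cannot express.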
You can also convert expressions in an automated way. For example, libraries exist that will let you run an expression such as the following directly in .NET:
TRANSFORM(DOW(DATE()+1))
You can also find utilities that will convert that expression to C# or Visual Basic equivalent source code (see the Tools for Conversion sidebar).
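By hand, the translation of that particular expression might look like this C# sketch; note the off-by-one between VFP’s DOW() (1 = Sunday) and .NET’s DayOfWeek enum (0 = Sunday):

```csharp
using System;

class ExpressionDemo
{
    static void Main()
    {
        // TRANSFORM(DOW(DATE() + 1)) translated by hand:
        // DATE() + 1 becomes Today.AddDays(1); DOW() becomes
        // DayOfWeek shifted up by one; TRANSFORM() becomes ToString().
        DateTime tomorrow = DateTime.Today.AddDays(1);
        string dow = ((int)tomorrow.DayOfWeek + 1).ToString();
        Console.WriteLine(dow);  // "1" through "7", depending on the day
    }
}
```

This is the word-for-word level of translation; automated expression converters do essentially this mapping for you.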
Translating entire blocks of code to produce the same results in another language will require the most attention. If, for instance, you’ve used a third-party library to incorporate TCP capabilities into your VFP application, you’ll be pleased to know that TCP support is now native in the .NET Framework and easy to use at that. Unfortunately, you will now have to rewrite your TCP functionality. If you’re converting data-specific functionality, you might find yourself converting a SCAN loop into a foreach() working against the rows of a DataTable-or you may find that rewriting the code as a stored procedure in SQL Server is the best approach.
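For the SCAN case, a sketch of the translation against an in-memory DataTable (the orders table and its column are invented) might look like this:

```csharp
using System;
using System.Data;
using System.Globalization;

class ScanLoopDemo
{
    static void Main()
    {
        // In-memory stand-in for a cursor retrieved from the server.
        var orders = new DataTable("orders");
        orders.Columns.Add("total", typeof(decimal));
        orders.Rows.Add(10.00m);
        orders.Rows.Add(2.50m);
        orders.Rows.Add(7.50m);

        // The VFP version:
        //   SCAN
        //       nGrandTotal = nGrandTotal + orders.total
        //   ENDSCAN
        decimal grandTotal = 0m;
        foreach (DataRow row in orders.Rows)
        {
            grandTotal += (decimal)row["total"];
        }

        Console.WriteLine(grandTotal.ToString(CultureInfo.InvariantCulture));  // 20.00
    }
}
```

Of course, a simple sum like this one is often better pushed down to SQL Server as a SELECT SUM(total), which is the stored-procedure point made above.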
Exception handling is another area that will require attention. VFP has default error handlers, Error() methods, ON ERROR statements, and TRY/CATCH blocks. C# and Visual Basic only have TRY/CATCH blocks.
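So code that leaned on ON ERROR or an Error() method gets restructured around explicit try/catch/finally, as in this sketch (the file path is made up):

```csharp
using System;
using System.IO;

class ErrorHandlingDemo
{
    static void Main()
    {
        try
        {
            // The risky operation that a VFP global ON ERROR handler
            // might have silently covered. This path does not exist.
            string text = File.ReadAllText(@"C:\missing\settings.ini");
            Console.WriteLine(text.Length);
        }
        catch (IOException)
        {
            // The recovery code that lived in the ON ERROR routine
            // or Error() method now sits next to the code it guards.
            Console.WriteLine("Could not read settings file.");
        }
        finally
        {
            // Cleanup runs whether or not the read succeeded.
            Console.WriteLine("Done.");
        }
    }
}
```

The structural change is that every handler is local and visible; there is no application-wide ON ERROR routine to fall back on.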
Still, in a well-constructed VFP application, the blocks of code will be small and discrete and will lend themselves to incremental conversion, one method at a time. As the conversion takes place, the visual walking skeleton of the application becomes functional.
Special Issues (ActiveX, FLL, OCX, Frameworks)
.NET does support ActiveX controls and COM objects through wrappers that encapsulate the unmanaged code; however, in most cases you will find that the control’s authors also make a managed .NET version that you can use instead. If you find a managed .NET version of the control, USE IT! If you must use an unmanaged control, you’ll find that .NET supports them MUCH better than VFP. No messing with AutoYield settings or using timers to get around weird UI quirks.
FLLs are not supported in .NET, but in recent years, FLLs are becoming scarcer in VFP apps. Functionality found in VFP FLLs such as JKEY’s incremental search for grids is baked right into .NET. In fact, .NET grids support advanced features such as sorting by any column in ascending or descending order right out of the box.
Some of the companies that built frameworks for VFP have also built .NET frameworks (OakLeaf’s Mere Mortals framework, for example), though the frameworks are not directly equivalent and there is no official upgrade path. In some cases, like Visual Extend, the “framework” produces native VFP code and doesn’t add components to the project. In almost all cases you won’t find a conversion for your VFP framework and will have to code accordingly. Frameworks are not nearly as prevalent in the .NET world, though the case for them is still compelling.
If you want to incorporate a .NET framework such as StrataFrame or Milos (part of the Milos Solution Platform) into your converted application, it will become part of your conversion effort at every level and you’ll find it well worth the effort.
Testing and QC
The testing process begins on the first day of your conversion and continues after you write the last line of code. Unlike creating a brand-new application, conversion projects have the advantage that you already know how the system should perform. If the new functionality matches the old functionality, it’s correct. If the new report comes out exactly the same as the old report, it’s correct. There is not the same level of work involved as there is in developing a new system, where you must also gather and refine requirements.
.NET code has a lot more support for testing than VFP. If you’re not familiar with the concepts of unit testing, look into NUnit or the unit testing built into the Team System versions of Visual Studio. You have a unique opportunity to incorporate testing into your application during the conversion process, because you will test to ensure that the .NET version of your code performs at least as well as the VFP version. Capturing these tests will not only help you in your conversion effort, but will give you invaluable tools for testing changes you make to your application later in its life. How many times have you been afraid to make a change to your application because you didn’t know the consequences? Having a battery of tests available helps you to answer that question with confidence.
In addition to unit testing, converting an application will mean you do a lot of regression testing. Regression testing is more of an end-to-end user experience test than a unit test. For example, while unit testing can validate the tax calculation for an invoice, it can’t test the user experience or validate that the newly calculated tax rate gets printed on the invoice correctly when the sale is over. Generally after you’ve converted each form, report, and process, you’ll pass it to the testers to “bang on.” As you convert entire groups of items and functions, your team will test them again as units. Finally, when the entire application is ready, it is tested again to see how it works as a whole.
Testing requires frequent conversions of the database as well as new versions of the code. In comparison to new product development, testing of conversion projects is faster and easier, but it is also more critical because the new code is designed to replace a mature application and the bugs that are expected in new software are not tolerated nearly as much in converted applications.
Conclusion
There is no point-and-click wizard that converts Visual FoxPro applications to .NET. VFP programmers who remember the wizards that converted FoxPro DOS and FoxPro Win applications to Visual FoxPro and Visual Basic programmers who have tried the tools and wizards for converting VB6 apps to Visual Basic .NET will tell you that automatic conversion of complex systems from one paradigm to another (let alone one language to another) doesn’t often work out as you hope. Still, conversion projects happen, tool vendors write tools that make the challenge a little less daunting, and you will have the experience of those who have gone before you. The task is neither trivial nor impossible. It’s another challenge-an opportunity to grow and learn.