Technology

Linux progress and HRU (Ham Radio University)

I installed the Code::Blocks IDE after reading some reviews and doing a little investigation.  It appears to be a nice IDE for the Linux environment.  It was a little tricky to get working and to configure the settings properly, but that’s part of the challenge when working in a new environment. I got my test program compiled and running under it.  So far so good.

I also discovered that the Microsoft mouse I was using does not work well with VNC, as it did not handle the scroll wheel events properly, so I swapped it out for a Lenovo mouse I had lying around. That fixed the scroll wheel issue. Way to go, Microsoft!

Now on to build some apps and get GNU Radio working.

I was at HRU (Ham Radio University) over the weekend and had a great time with a bunch of friends.  Picked up a few new ideas to investigate.

xUbuntu

I installed xUbuntu on my laptop and my 2950 server, added xrdp, and it seems good to go.  One small annoyance is that the scroll wheel on the mouse does not work in the remote desktop.  After doing some research, it appears this was supposed to be corrected after the v12 or v14 release of Ubuntu; I can tell you it is not fixed.  If anyone has any ideas as to why it does not work, please contact me.

Now to add development and support programs to see how they work out.

I’m looking for a good dev environment; if anyone has any suggestions, I am eager to hear your recommendations.

Foray into Linux

It’s been a while since I used any Unix operating system. I’ve spent the last decade or so managing and writing code for Microsoft platforms, to the tune of about 1.5 million lines of code on the ASP.NET platform and about 150,000 lines of SQL Server stored procedures. To say the least, it’s been a lot.

In a previous life I developed applications that had to exist on the NT platform, Sun Unix (real Unix), HP Unix, Data General Unix, the RS/6000, and even the NeXT operating system (which evolved into OS X). Prior to that I wrote a financial trading system using the QNX OS. QNX, for those who are not familiar, is a Unix-like OS that has migrated into a specialty field: it is the OS Toyota uses in their “smart” cars to control the cameras and other equipment in the car. There is a nice API which I hope to investigate if I get one of the Toyotas that uses it. You can become an app developer for it and deliver your apps to the cars.
There are several reasons for getting back to Unix at this time.
1. To see if I can do everything I need to do on a Linux laptop.
2. I recently purchased a Lime Microsystems software-defined radio board, and the utilities are all written for Ubuntu Linux.
3. I want to make a cell phone that runs the Android operating system and incorporates the features of the Lime Micro chipset.

I purchased a Raspberry Pi a couple of years ago and created a greenhouse controller with it. It worked out nicely: it turned fans on and off, monitored temperature and humidity, and opened vents as needed. The Pi uses a Debian-based OS, which did most of what I was looking to do with it at the time.
The key features I want in an OS are:
1. Remote desktop – a must
2. Good GUI windowing system – also a must
3. Easy to set up
4. Easy to use
For my test environment I have a couple of Dell D630 laptops, old but still functional, and a couple of Dell 2950 servers. My intention is to set up the laptop with Linux and keep a mirror on the 2950, hence the requirement for remote desktop. I have remote access to all the systems I use daily, so no matter where I am in the world, as long as I have an internet connection I can get work done. This is a key factor and something I have been doing since the mid 1990s without issue.

What I have done so far:
1. I started out using Debian on the laptop and the server; installed it about 3 or 4 times.
2. I investigated Kali Linux – a great distribution for hacking and pen testing.
3. Getting a Tails USB install going.
4. Installed Ubuntu on the server twice and on the laptops 4 or 5 times. Trying out different settings etc.
5. In trying to get the remote desktop working I ran across the MATE version, which I can say I do not like, as there is little consistency between the desktop console and the remote desktop console. I also didn’t care for all of the other “stuff” that it seemed to think I needed.
6. Found that Linux, like most other OSes, is a disk hog and quickly filled the 250 GB drive on the laptop. So I replaced the outdated 250 GB drive with a 2 TB drive. Hopefully that will last a while till it gets full.
7. Back to installing a clean version of Ubuntu.
8. This time I investigated the remote desktop client first and came across an article on using XRDP and the Xfce desktop. This is really the first windowing system for Linux that I like out of the box.
9. My next step is to reinstall xUbuntu on the system and see how that install goes.
With all of the fuss everyone I know makes about Linux, I expected a more user-friendly OS. As I mentioned before, I was a “Real Unix” developer. Sun Microsystems’ OS, the then-current incarnation of AT&T Unix (whose source is no longer available), was a lot easier to get up and running with all the standard tools needed for development. When I was writing code for multiple platforms in the mid 1990s it seemed easier to make things work. Of course, since it was Sun, the libraries and tools were geared towards efficiency, something that seems to be lacking in the Linux world. I expect these tools exist, but there is a lot of digging and investigation to figure out which path is the best one to use. Each product and programming environment claims to be the best, and it takes a lot of time before you can determine whether you wasted your time or spent it wisely. If any of you Unix/Linux people reading this have ideas, please fill me in on them so hopefully I can save some investigation time.

Purpose of this Blog

The purpose of this blog is to discuss the various aspects of designing a real-world Internet-based web application.  This would not be a standard HTML web site, but one that relies heavily on data being input, manipulated, and displayed back to the consumer.  I plan on starting by outlining the aspects of a large system design and what factors have influenced me in my designs.  See my resume for a better understanding of my background, but to give you a brief summary: I have in excess of 35 years of experience in the computer industry, starting just prior to the advent of the microprocessor.  I have had a computer at my disposal from those early days, when not too many people even knew what a microprocessor was.  I then went on to designing financial trading systems and then a few large data-driven web environments.  I welcome other points of view, and if you wish to contribute to any blog post please do not hesitate to contact me.

PowerShell and SQL Server – Save SQL Server DB Objects to Folder

Recently I was looking to export all of the objects in one of my databases into text files so that I could add them to SVN (source code control). After investigating and writing a few T-SQL scripts that accomplished this task, I ran across an article that used PowerShell to do the same thing, only better. After a crash self-taught PowerShell course, I modified the example script and achieved my goal of saving the create scripts so they could be added to SVN. After some tweaking, I am now able to save the db objects to text files. The way I export them, SVN will only flag changes to the file contents, not the fact that the file date and time has changed. It is a cool feature of SVN that it actually analyzes the content.

There are a number of scripting options that can be set to modify the way each object’s script gets saved. For the most part they are the same options available when you generate scripts from SSMS.
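For example, a couple of additional options can be flipped the same way the ones in the script below are set (these particular properties are suggestions of mine rather than part of my script, so verify they exist in your SMO version before relying on them):

$scriptr.Options.SchemaQualify = $True # prefix each object with its schema name
$scriptr.Options.NoCollation = $True # leave out collation clauses so diffs stay cleaner
$scriptr.Options.ScriptBatchTerminator = $True # emit GO between batches (needs ToFileOnly)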

I am looking forward to using PowerShell for general server management in the future.

The script I created is listed below. Please leave a comment if you download it, use it, or have any suggestions for improvements.

<#
.SYNOPSIS
Save SQL Server DB objects to a folder for storage in a source code control system.
.DESCRIPTION
This script saves SQL Server database objects - tables, stored procedures, views and functions - to a directory
so that they can be stored in a source code control system such as SVN for safe keeping. The script can be started with parameters.
If no parameters are specified the program will prompt for them. The files can be saved in a standard place or can be saved
in a new directory each time, with the date and time as the directory name.

Parameters:
-srvname the SQL Server we are connecting to - you can set a default in the script or delete it to always require entry
-dbname the database whose objects will be scripted - you can set a default in the script or delete it to always require a value
-username the SQL Server user name (for example "sa") - you can set a default in the script or delete it to always require a value
-password the password for the SQL Server login - you will be prompted if you do not enter a value
-path the output path - you can set a default in the script or delete it to always require a value
-CreateVerbosePath if set to $true, the output files will be stored in a unique directory named with the date and time of the export
.NOTES
File Name : Save_SQL_Server_DB_Objects_To_Folder.ps1
Author : Richard Siena rich@richardsiena.com
Prerequisite : PowerShell V3 on Windows Vista and later.
Copyright 2014 – Richard Siena
.LINK
Script posted at:
http://www.richardsiena.com/techblog
.EXAMPLE
Save_SQL_Server_DB_Objects_To_Folder.ps1
.EXAMPLE
Save_SQL_Server_DB_Objects_To_Folder.ps1 -password xyz
#>

[CmdletBinding()]
Param(
[Parameter()] [string]$srvname = "default server",
[Parameter()] [string]$dbname = "default db",
[Parameter()] [string]$username = "sa",
[Parameter()] [string]$password,
[Parameter()] [boolean]$CreateVerbosePath = $false,
[Parameter()] [string]$Path = "$home\Documents\SQL Server Management Studio\Projects\SQL\SQLObjects\"
)

#—————————Test for acceptable values———————
if ($srvname.Length -eq 0) { $srvname = $( Read-Host "Enter SQL Server host name" ) }
if ($dbname.Length -eq 0) { $dbname = $( Read-Host "Enter SQL Server database name" ) }
if ($username.Length -eq 0) { $username = $( Read-Host "Enter user name" ) }
if ($password.Length -eq 0) { $password = $( Read-Host "Enter password" ) }
if ($Path.Length -eq 0) { $Path = $( Read-Host "Enter output path" ) }

if (($srvname.Length -eq 0) -or ($username.Length -eq 0) -or ($password.Length -eq 0) -or ($Path.Length -eq 0) -or ($dbname.Length -eq 0))
{
write-output "Must enter all parameters; use Get-Help for more information"
exit
}

#———— debugging data ———————–
<#
Write-Debug "srvname = $srvname"
Write-Debug "dbname = $dbname"
Write-Debug "username = $username"
Write-Debug "password = $password"
Write-Debug "CreateVerbosePath = $CreateVerbosePath"
Write-Debug "Path = $Path"
#>
#————–done with params————–
$FullPath = ""

# Load the SMO assemblies used to connect to and script the server
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | out-null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null

# Build a SQL-authenticated connection to the server
$SMOserverConn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
$SMOserverConn.ServerInstance = $srvname
$SMOserverConn.LoginSecure = $false
$SMOserverConn.Login = $username
$SMOserverConn.Password = $password

$SMOserver = new-object Microsoft.SqlServer.Management.SMO.Server($SMOserverConn)
$db = $SMOserver.Databases[$dbname]
#————we are now logged into the server————————–

# Gather the database objects we want to script out
$Objects = $db.Tables
$Objects += $db.Views
$Objects += $db.StoredProcedures
$Objects += $db.UserDefinedFunctions

#—-build paths
if ($CreateVerbosePath) {
$SavePath = $Path + $($dbname)
$DateFolder = get-date -format yyyyMMddHHmm
new-item -type directory -name "$DateFolder" -path "$SavePath"
}
else
{
$SavePath = $Path
}
#———Get the item and save it——————–
foreach ($ScriptThis in $Objects | where {!($_.IsSystemObject)}) {

# Subdirectories for each object type ($ScriptThis.GetType().Name) are created further below

$scriptr = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($SMOserver)
$scriptr.Options.AppendToFile = $True
$scriptr.Options.AllowSystemObjects = $False
$scriptr.Options.ClusteredIndexes = $True
$scriptr.Options.DriAll = $True
$scriptr.Options.IncludeHeaders = $False
$scriptr.Options.ToFileOnly = $True
$scriptr.Options.Indexes = $True
$scriptr.Options.Permissions = $True
$scriptr.Options.WithDependencies = $False

<#Script the Drop too#>
$ScriptDrop = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($SMOserver)
$ScriptDrop.Options.AppendToFile = $False
$ScriptDrop.Options.AllowSystemObjects = $False
$ScriptDrop.Options.ClusteredIndexes = $True
$ScriptDrop.Options.DriAll = $True
$scriptDrop.Options.IncludeIfNotExists = $True
$ScriptDrop.Options.ScriptDrops = $True
$ScriptDrop.Options.IncludeHeaders = $False
$ScriptDrop.Options.ToFileOnly = $True
$ScriptDrop.Options.Indexes = $True
$ScriptDrop.Options.WithDependencies = $False

<#This section builds folder structures. Remove the date folder if you want to overwrite#>
$TypeFolder=$ScriptThis.GetType().Name

if ($CreateVerbosePath) {

if (!(Test-Path -Path "$SavePath\$DateFolder\$TypeFolder")) {
new-item -type directory -name "$TypeFolder" -path "$SavePath\$DateFolder"
}
$FullPath = "$SavePath\$DateFolder"
}
else {

if (!(Test-Path -Path "$SavePath\$TypeFolder")) {
new-item -type directory -name "$TypeFolder" -path "$SavePath"
}
$FullPath = "$SavePath"
}

"Scripting Out $TypeFolder $ScriptThis"

# Strip the square brackets from the object name so it makes a valid file name
$ScriptFile = $ScriptThis -replace "\[|\]"
$ScriptDrop.Options.FileName = "$FullPath\$TypeFolder\$ScriptFile.SQL"
$scriptr.Options.FileName = "$FullPath\$TypeFolder\$ScriptFile.SQL"

#This is where each object actually gets scripted one at a time.
$ScriptDrop.Script($ScriptThis)
$scriptr.Script($ScriptThis)

} #This ends the loop

$SMOserver.ConnectionContext.Disconnect()
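To give a feel for how I use it, here is a hypothetical run from the PowerShell prompt followed by the SVN commands (the server, database, and working-copy paths are made up for illustration, and it assumes the svn command-line client is installed):

.\Save_SQL_Server_DB_Objects_To_Folder.ps1 -srvname "MYSERVER\SQL2012" -dbname "MyAppDB" -username "sa" -path "C:\svn\MyAppDB\SQLObjects\"
# the script prompts for the password since it was not supplied above
cd C:\svn\MyAppDB
svn add --force SQLObjects # picks up any newly created object files
svn commit -m "Export of database objects"

Because the files are rewritten in place on each run, SVN only flags the objects whose content actually changed.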

Before you start

When starting a new project there are a number of factors to consider. There are a multitude of choices available, and your choice will most likely be based on prior experience, industry prejudice, and influence from friends, colleagues, and popularity (what’s cool at the moment). I will focus on web development here, but the general rules apply to any development, from systems programming to embedded development. The only thing that changes is the set of products available in that particular field.

  • Platform
    • Unix
    • Linux
    • Windows
    • Other
  • Web Server
    • IIS
    • Apache
    • Other
  • Database
    • SQL Server
    • My SQL
    • Oracle
    • Other
  • Language
    • VB
    • C#
    • C++
    • PHP
    • Java
    • JSP
    • ASP
    • ASP.net
    • Python
    • Ruby
    • Rails
    • The choices here are almost endless and often may require more than one language, depending on the needs of the particular functions you are implementing.  They will depend on your particular background and experience, the time available to learn a new language versus the schedule pressure for the project, and the resource pools you have available for programming.  The important thing here is design and consistency.
  • CSS
    • CSS 960
    • Some CSS styling method or create your own. The CSS styling is one of the most important aspects of the design.  Choosing a good one will reduce the amount of confusion and presentation rewriting over the course of the project.
  • Team
    • Team selection is critical at the beginning phase of the project. Choosing wrong or weak team members in certain areas will cause delays further down the timeline and force code rewrites to fix poor design and coding practices. Often one person may take on many of the roles listed below; the assignment should be based on actual knowledge, not just a desire to work in that area.
    • DB architect
    • CSS and Presentation
    • Business analyst
    • Visionary designer
    • Programmers
    • System architect
      • Hardware architect
      • Network design
    • Software
      • Here the choice is critical: will you be designing from scratch or expanding on a preexisting platform? The benefit of writing from scratch is flexibility, but it may cost schedule time to implement base features that every system needs, such as user management.
      • Build it
      • Buy it
      • Combination of both

My decisions in the areas listed above are based on a combination of my personal experience, resource availability, and the longevity of the platform and services under consideration.

At one company I was with, we were constantly evaluating new platforms to migrate the product to, based on what we felt the users would want and what the industry was providing.  At that time I went to two presentations by Microsoft where Steve Ballmer was the presenter.  This was in the mid-to-late ’80s, when he would give presentations directly to the decision makers.  The first presentation was on Microsoft Windows 386.  If you have not heard of it, do a search for the Windows 386 video showing off the multitasking aspects of Windows: one of the more embarrassing marketing videos in Microsoft’s history, and worth watching if you want some comic relief.  The main message of the presentation was “Drop everything and move all programs you are working on to Windows 386! It’s the way of the future.”  Later that day I went to a presentation on OS/2, and the message was “Drop everything and move all programs you are working on to OS/2! It’s the way of the future.”

If you worked on either of those platforms, which I did, you got burnt when the products were dropped, leaving you having to rewrite or redo your product in a new environment.  The lesson here is to be careful when listening to software producers who just want to sell product.  Of course, sometimes you have no choice, when the decision is made by someone else.

Another example is Microsoft’s user management functions.  They change frequently, with no migration path to the new methodology.  I have been burnt by at least 4 different versions of this upgrade.  The lesson here is: do not become too dependent on a manufacturer’s bells and whistles in your software design, as they may not be supported tomorrow, leaving you stuck and having to rewrite your code at a moment’s notice.  Unless, of course, you are one of the top 100 companies and can pressure the manufacturer into continuing support for products they want to discontinue.

I try to use a bare minimum of features from the software suppliers and do more with less.  It reduces the need for training and makes the code easier to understand.  As you add new team members, they can come up to speed faster.  Of course you get the age-old criticism that your code is not complicated enough, but remember: it is harder to write simple code than complicated code that is unmanageable.

Another point to consider when designing a system: although efficiency is important, I feel flexibility is more important, and I am willing to sacrifice some efficiency for being able to turn on a dime as the market dictates.  Remember, if it will take you a few months to optimize your code, but you can achieve greater efficiency by throwing the latest hardware at it that runs twice as fast, is the cost of coding outweighed by the cost of the new hardware?  More on this in future posts, as this is a mistake often made by team members with little practical experience.  In my experience, real coding efficiencies are often in areas that are overlooked and not understood, and modern compilers will often make the code more efficient for you.

Back to starting a new project.  Build your team with a core group of people who are all on the same page.  Learn some team-building techniques and set standards up front.  Document as much as you can for ease of future growth.  Sometimes your environment will dictate many of the choices for you, depending on whether this is a startup company or a new project in a larger or existing company.  When you do not have free rein in what to use, remember: the best language or environment to use is the one that you are getting paid to use.
