Sunday, March 25, 2012

Can I export tables so that existing tables in the destination database will be modified?

I'm working on an ASP.Net project where I want to test code on a local
machine using a local database as a back-end, and then export it to
the production machine where it uses the hosting provider's SQL Server
database on the back-end. Is there a way to export tables from one
SQL Server database to another in such a way that if a table already
exists in the destination database, it will be updated to reflect the
changes to the local table, without existing data in the destination
table being lost? e.g. suppose I change some tables in my local
database by adding new fields. Can I "export" these changes to the
destination database so that the new fields will be added to the
destination tables (and filled in with default values), without losing
data in the destination tables?
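
(To make the example concrete: the kind of change I have in mind is something
like the following, where the column name is just made up for illustration:

    ALTER TABLE wbuser ADD signup_date datetime NOT NULL DEFAULT GETDATE()

The question is whether I can get that same change applied to the production
copy of the table without touching the rows that are already in it.)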

If I run the DTS Import/Export Wizard that comes with SQL Server and
choose "Copy table(s) and view(s) from the source database" and choose
the tables I want to copy, there is apparently no option *not* to copy
the data, and since I don't want to copy the data, that choice doesn't
work. If instead of "Copy table(s) and view(s) from the source
database", I choose "Copy objects and data between SQL Server
databases", then on the following options I can uncheck the "Copy
Data" box to prevent data being copied. But for the "Create
Destination Objects" choices, I have to uncheck "Drop destination
objects first" since I don't want to lose the existing data. But when
I uncheck that and try to do the copy, I get collisions between the
properties of the local table and the existing destination table,
e.g.:
"Table 'wbuser' already has a primary key defined on it."

Is there no way to do what I want using the DTS Import/Export Wizard?
Can it be done some other way?

-Bennett"Bennett Haselton" <bennett@.peacefire.org> wrote in message
news:e614455c.0411160123.294dfd10@.posting.google.c om...
> I'm working on an ASP.Net project where I want to test code on a local
> machine using a local database as a back-end, and then export it to
> the production machine where it uses the hosting provider's SQL Server
> database on the back-end. Is there a way to export tables from one
> SQL Server database to another in such a way that if a table already
> exists in the destination database, it will be updated to reflect the
> changes to the local table, without existing data in the destination
> table being lost? e.g. suppose I change some tables in my local
> database by adding new fields. Can I "export" these changes to the
> destination database so that the new fields will be added to the
> destination tables (and filled in with default values), without losing
> data in the destination tables?

In a word, no.

I'd suggest taking the same version-controlled scripts (you are of course
using version control, right? :-) that you used to make the changes in Dev
and running them against the Prod server.
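
For the example you gave -- a new column with a default value -- such a script
can be quite short. This is only a sketch (the column name is invented, and the
IF is only there so the script can be re-run without erroring out):

    -- Add signup_date to wbuser if it is not already there; the default
    -- backfills the existing rows, so no data is touched or lost.
    IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
                   WHERE TABLE_NAME = 'wbuser' AND COLUMN_NAME = 'signup_date')
    BEGIN
        ALTER TABLE wbuser
            ADD signup_date datetime NOT NULL
            CONSTRAINT DF_wbuser_signup_date DEFAULT (GETDATE()) WITH VALUES
    END

Check that file into source control with the rest of your schema, run it on Dev,
and run the same file on Prod when you deploy.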

> [rest of the original question about the DTS Import/Export Wizard snipped]
> -Bennett

"Greg D. Moore \(Strider\)" <mooregr_deleteth1s@.greenms.com> wrote in message
news:<7Mnmd.16$1u.15@.twister.nyroc.rr.com>...
> "Bennett Haselton" <bennett@.peacefire.org> wrote in message
> news:e614455c.0411160123.294dfd10@.posting.google.com...
> > [original question snipped]
> In a word, no.
> I'd suggest taking the same version-controlled scripts (you are of course
> using version control, right? :-) that you used to make the changes in Dev
> and running them against the Prod server.

Uh, no... my "script" is that I move the mouse pointer over the
"wbuser" table in Server Explorer in VS .Net, right-click, pick
"Design Table", and add columns that way :)

So I guess I'll just have to remember to run that "script" against the
Prod database whenever I make any changes.

Still, I would have thought that what I was trying to do would be a pretty
common thing for people to want to do... I'm surprised it's not built in.

-Bennett

Bennett Haselton (bennett@.peacefire.org) writes:
> Uh, no... my "script" is that I move the mouse pointer over the
> "wbuser" table in Server Explorer in VS .Net, right-click, pick
> "Design Table", and add columns that way :)
> So I guess I'll just have to remember to run that "script" against the
> Prod database whenever I make any changes.
> Still, I would have thought that what I was trying to do would be a pretty
> common thing for people to want to do... I'm surprised it's not built in.

There are third-party tools. Red Gate's SQL Compare is popular.

One reason this is not built in is that changes to tables are far from
always something that can be performed automatically. In our shop we keep
all table definitions under source control. To generate an update script we
have a tool that reads SourceSafe between two labels to load all new and
changed objects. For changed tables you get a script that copies data from
the old definition to the new, but this script is really only a template.
In many cases I need to change the script, because a new column is supposed
to be initialized with values derived from other columns or tables. And
there might be even wilder changes.
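
As a rough illustration -- the table and column names here are invented, and a
real script also has to recreate constraints, indexes, permissions and so on --
the generated template is of this shape:

    -- Build the table with the new definition, copy the rows across,
    -- then swap the tables. The SELECT list is what typically has to be
    -- edited by hand, e.g. to derive the new column from existing data.
    CREATE TABLE wbuser_new (
        user_id     int         NOT NULL PRIMARY KEY,
        user_name   varchar(50) NOT NULL,
        signup_date datetime    NOT NULL
    )

    INSERT INTO wbuser_new (user_id, user_name, signup_date)
    SELECT user_id, user_name, COALESCE(first_login, GETDATE())
    FROM   wbuser

    DROP TABLE wbuser
    EXEC sp_rename 'wbuser_new', 'wbuser'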

Point and click may be convenient, but something which is to be run several
times needs to be automated, so that you can be sure all updates are carried
out the same way. Thus, you need a script.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp
