Search Results

Search found 85897 results on 3436 pages for 'flat file source'.


  • Database source control

    - by Bojan Skrchevski
     Should database files (scripts etc.) be in source control? If so, what is the best method for keeping them there and updating them? Is there even a need for database files to be in source control, since we can put the database on a development server where everyone can use it and make changes to it if needed? But then we can't get it back if someone messes it up. What approach works best for databases in source control?
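
     One common approach is to script every object out to its own file and commit those files like ordinary code. A minimal sketch of that idea, assuming SQL Server, the pyodbc driver, and placeholder server/database names:

       # Sketch: dump each programmable object's definition to a file for source control.
       # Assumes SQL Server, a pyodbc ODBC driver, and a "scripts/" folder in the repo.
       import os
       import pyodbc

       conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                             "SERVER=devserver;DATABASE=MyDb;Trusted_Connection=yes")
       cur = conn.cursor()
       cur.execute("""
           SELECT s.name, o.name, m.definition
           FROM sys.sql_modules m
           JOIN sys.objects o ON o.object_id = m.object_id
           JOIN sys.schemas s ON s.schema_id = o.schema_id
       """)
       os.makedirs("scripts", exist_ok=True)
       for schema, name, definition in cur.fetchall():
           with open(os.path.join("scripts", f"{schema}.{name}.sql"), "w") as f:
               f.write(definition)
       # The resulting .sql files can then be committed like any other source code.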

    Read the article

  • SSIS DTS Package flat file error - "The file name specified in the connection was not valid"

    - by MisterZimbu
     I have a pretty basic SSIS package that is attempting to read a file hosted on a share, and import its contents to a database table. The package runs fine when I run it manually within SSIS. However, when I set up a SQL Agent job and attempt to execute it, I get the following error:

       Executed as user: DOMAIN\UserName. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 64-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
       Started: 10:14:17 AM
       Error: 2010-05-03 10:14:17.75 Code: 0xC001401E Source: DataImport Connection manager "Data File Local" Description: The file name "\\10.1.1.159\llpf\datafile.dat" specified in the connection was not valid. End Error
       Error: 2010-05-03 10:14:17.75 Code: 0xC001401D Source: DataAnimalImport Description: Connection "Data File Local" failed validation. End Error
       DTExec: The package execution returned DTSER_FAILURE (1).
       Started: 10:14:17 AM Finished: 10:14:17 AM Elapsed: 0.594 seconds.
       The package execution failed. The step failed.

     This leads me to believe it's a permissions issue, but every attempt I've made to fix it has failed. What I've tried so far:

     - Run as the SQL Agent account (DOMAIN\SqlAgent) - yields same error. DOMAIN\SqlAgent has "Full Control" permissions on both the share and the uploaded file.
     - Set up a proxy account with a different account's credentials (DOMAIN\Account) - yields same error. Like above, "Full Control" permissions were given over the share to that account.
     - Gave "Everyone" full control permissions over the share (temporarily!). Yielded same error.
     - Manually copied the file to a local path and tested with the SQL Agent account. Worked properly.
     - Added an ActiveX script task that would first copy the remotely hosted file to a local path, and then have the DTS package reference the local file. Gave a completely nondescriptive (even by SSIS standards) error when trying to run the script.
     - Set up a proxy account, using my own personal account's credentials - worked correctly. However, this is not an acceptable solution as there are password policies in place on my account, as well as being a bad practice to set things up this way in general.

     Any ideas? I'm still convinced it's a permissions issue. However, what I've read from various searches more or less says giving the executing account permissions on the share should work. However, this is not the case here (unless I'm missing something obscure when I'm setting up permissions on the share).
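
     As a quick way to separate package validation problems from genuine share permissions, a small script run under the same account the job uses (for example via runas or a scheduled task) can confirm whether that account can actually open the UNC path. A minimal sketch, reusing the path from the error message:

       # Minimal check: can the account running this script open the share path the package uses?
       # The path below is the one reported in the SSIS error; adjust as needed.
       import getpass

       path = r"\\10.1.1.159\llpf\datafile.dat"
       print("Running as:", getpass.getuser())
       try:
           with open(path, "rb") as f:
               print("Opened OK, first bytes:", f.read(16))
       except OSError as exc:
           print("Failed to open:", exc)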

    Read the article

  • Creating reproducible builds to verify Free Software

    - by mikkykkat
     Free Software is about freedom and privacy. Open Source software is great, but making that fully practical usually won't happen. Most Free Software developers publish binaries that we can't verify are really compiled from the source code, or don't have something bad injected already! We have the freedom to change the code, but privacy for ordinary users is missing. For desktop software there are a lot of languages and opportunities to create Free Software with a reproducible build process (compiling source code to always produce the exact same binary), but for mobile computing I don't know if the same thing is possible or not. Mobile devices are probably the future of computing, and Android is the only Open Source environment so far that accepts Java for coding. Compiling the same Android application won't result in the exact same binary every time. For Open Source Android apps, how can we verify that the produced binary (.apk) is really compiled from the source code? Is there any way to create reproducible builds from the Android SDK, or does Java fail here for Free Software? Is there any Java software ever written with a reproducible build?
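
     Whatever the build system, the verification step itself is simple: rebuild from source and compare a cryptographic hash of your output with the published binary. A minimal sketch of that comparison, assuming both .apk files are available locally under placeholder names:

       # Compare a locally built APK against the published one by SHA-256 digest.
       import hashlib

       def sha256_of(path):
           h = hashlib.sha256()
           with open(path, "rb") as f:
               for chunk in iter(lambda: f.read(1 << 20), b""):
                   h.update(chunk)
           return h.hexdigest()

       mine = sha256_of("app-release-local.apk")            # built from source
       published = sha256_of("app-release-published.apk")   # downloaded binary
       print("identical" if mine == published else "binaries differ")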

    Read the article

  • Shared Data Source name error underscore characters added

    - by mick
     The name of our shared data source in RS (Report Server) is "AF1 Live Database" (no underscore characters - just spaces between words) and it is the same in Report Builder in VS. However, the following error pops up when the RDL of this report is uploaded onto our company site and run:

       The report server cannot process the report or shared dataset. The shared data source 'AF1_Live_Database' for the report server or SharePoint site is not valid. Browse to the server or site and select a shared data source. (rsInvalidDataSourceReference)

     We have no idea why the error reports the shared data source as 'AF1_Live_Database', with underscore characters. As this appears to be the problem that keeps the report from running, we are seeking your help. Thanks.

    Read the article

  • Utility or technique for swapping files quickly in Windows

    - by foraidt
     I frequently need to swap one file with another, without overwriting the original. Let's say there are two files, foo_new.dll and foo.dll. I usually rename them the following way: foo.dll -> foo_old.dll, foo_new.dll -> foo.dll, [do something with the replaced file], foo.dll -> foo_new.dll, foo_old.dll -> foo.dll. This is OK for swapping a single file, but it becomes tedious when swapping multiple files at once. Is there a Windows (7 and preferably XP) utility or a technique that simplifies this task and works well when swapping multiple files? I'd prefer to be able to use it from within FreeCommander, but Windows Explorer would be ok, too.
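
     For what it's worth, the rename sequence described above is easy to script, which takes the tedium out of swapping several files at once. A minimal sketch in Python, with placeholder file names:

       # Swap each foo.dll with its foo_new.dll counterpart, keeping the original as foo_old.dll.
       import os

       files = ["foo.dll", "bar.dll"]  # placeholder list of files to swap

       def swap_in(name):
           base, ext = os.path.splitext(name)
           os.rename(name, base + "_old" + ext)      # foo.dll     -> foo_old.dll
           os.rename(base + "_new" + ext, name)      # foo_new.dll -> foo.dll

       def swap_back(name):
           base, ext = os.path.splitext(name)
           os.rename(name, base + "_new" + ext)      # foo.dll     -> foo_new.dll
           os.rename(base + "_old" + ext, name)      # foo_old.dll -> foo.dll

       for f in files:
           swap_in(f)
       # ... do something with the replaced files ...
       for f in files:
           swap_back(f)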

    Read the article

  • How to protect compiled Java classes?

    - by Registered User
     I know many similar questions have been asked here. I am not asking if I can protect my compiled Java classes - because obviously you will say 'no you can't'. I am asking: what is the best known method of protecting Java classes against decompilation? If you are aware of any research or academic paper in this field, please do let me know. Also, if you have used some methods or software, please share your experience. Any kind of information will be very useful. Thank you.

    Read the article

  • Sweet and Sour Source Control

    - by Tony Davis
     Most database developers don't use Source Control. A recent anonymous poll on SQL Server Central asked its readers "Which Version Control system do you currently use to store your database scripts?" The winner, with almost 30% of the vote was...none: "We don't use source control for database scripts". In second place with almost 28% of the vote was Microsoft's VSS. VSS? Given its reputation for being buggy, unstable and lacking most of the basic features required of a proper source control system, answering VSS is really just another way of saying "I don't use Source Control". At first glance, it's a surprising thought. You wonder how database developers can work in a team: how they find out what changed when the system worked before but is now broken; work out what happened to their changes that now seem to have vanished; roll back a mistake quickly so that the rest of the team has a functioning build; or find instantly whether a suspect change has been deployed to production. Unfortunately, the survey didn't ask about the scale of the database development, and correlate the two questions. If there is only one database developer within a schema, who has an automated approach to regular generation of build scripts, then the need for a formal source control system is questionable. After all, a database stores far more about its metadata than a traditional compiled application. However, what is meat for a small development is poison for a team-based development. Here, we need a form of Source Control that can reconcile simultaneous changes, store the history of changes, derive versions and builds and that can cope with forks and merges. The problem comes when one borrows a solution that was designed for conventional programming. A database is not thought of as a "file", but as a vast, interdependent and intricate matrix of tables, indexes, constraints, triggers, enumerations, static data and so on, all subtly interconnected. It is an awkward fit. Subversion with its support for merges and forks, and the tolerance of different work practices, can be made to work well, if used carefully. It has a standards-based architecture that allows it to be used on all platforms such as Windows, Mac, and Linux. In the words of Erland Sommarskog, developers should "just do it". What's in a database is akin to a "binary file", and the developer must work only from the file. You check out the file, edit it, and save it to disk to compile it. Dependencies are validated at this point and if you've broken anything (e.g. you renamed a column and broke all the objects that reference the column), you'll find out about it right away, and you'll be forced to fix it. Nevertheless, for many this is an alien way of working with SQL Server. Subversion is the powerhouse, not the GUI. It doesn't work seamlessly with your existing IDE, and that usually means SSMS. So the question then becomes more subtle. Would developers be less reluctant to use a fully-featured source (revision) control system for a team database development if they had a turn-key, reliable system that fitted in with their existing work-practices? I'd love to hear what you think. Cheers, Tony.

    Read the article

  • New Source Database Added for EBS 12 + 11gR2 Transportable Tablespaces

    - by John Abraham
     The Transportable Tablespaces (TTS) process was originally certified for the migration of E-Business Suite R12 databases going from a source database of 11gR1 or 11gR2 to a target of 11gR2. This certification has now been expanded to include a source database of 10gR2 (10.2.0.5) - this will potentially save time for existing 10gR2 customers, as they can skip a crucial upgrade step prior to performing the platform migration. The migration process requires an updated Controlled patch delivered by the Oracle E-Business Suite Platform Engineering team, i.e. it requires a password obtainable from Oracle Support. We released the patch in this manner to gauge uptake, and help identify and monitor any customer issues due to the nature of this technology. This patch has been updated to now include support for 10gR2 as a source database.

     Does it meet your requirements?

     Note that for migration across platforms of the same "endian" format, users are advised to use the Transportable Database (TDB) migration process instead for large databases. The "endian-ness" of target platforms can be verified by querying the view V$DB_TRANSPORTABLE_PLATFORM using SQL*Plus (connected as sysdba) on the source platform:

       SQL> select platform_name from v$db_transportable_platform;

     If the intended target platform does not appear in the output, it means that it is of a different endian format from the source. Consequently, database migration will need to be performed via Transportable Tablespaces (for large databases) or export/import.

     The use of Transportable Tablespaces can greatly speed up the migration of the data portion of the database. However, it does not affect metadata, which must still be migrated using export/import. We recommend that users initially perform a test migration on their database, using export/import with the 'metrics=y' parameter. This will help identify the relative amounts of data and metadata, and provide a basis for assessing likely gains in timing. In general, the larger the amount of data (compared to metadata), the greater the reduction in downtime that can be expected from using TTS as a migration process. For smaller databases or for those that have relatively small data compared to metadata, TTS will not be as beneficial for cross endian migration and the use of export/import (datapump) for the whole database is recommended.

     Where can I find more information?

     - Using Transportable Tablespaces to Migrate Oracle E-Business Suite Release 12 Using Oracle Database 11g Release 2 Enterprise Edition (My Oracle Support Document 1311487.1)
     - Oracle Database Administrator's Guide 11g Release 2 (11.2)

     Related Articles

     - Database Migration using 11gR2 Transportable Tablespaces Now Certified for EBS 12
     - New Source Databases Added for Transportable Tablespaces + EBS 11i
     - 10gR2 Transportable Tablespaces Certified for EBS 11i
     - Migrating E-Business Suite Release 11i Databases Between Platforms
     - Migrating E-Business Suite Release 12 Databases Between Platforms

    Read the article

  • elffile: ELF Specific File Identification Utility

    - by user9154181
     Solaris 11 has a new standard user level command, /usr/bin/elffile. elffile is a variant of the file utility that is focused exclusively on linker related files: ELF objects, archives, and runtime linker configuration files. All other files are simply identified as "non-ELF". The primary advantage of elffile over the existing file utility is in the area of archives: elffile examines the archive members and can produce a summary of the contents, or per-member details.

     The impetus to add elffile to Solaris came from the effort to extend the format of Solaris archives so that they could grow beyond their previous 32-bit file limits. That work introduced a new archive symbol table format. Now that there was more than one possible format, I thought it would be useful if the file utility could identify which format a given archive is using, leading me to extend the file utility:

       % cc -c ~/hello.c
       % ar r foo.a hello.o
       % file foo.a
       foo.a: current ar archive, 32-bit symbol table
       % ar r -S foo.a hello.o
       % file foo.a
       foo.a: current ar archive, 64-bit symbol table

     In turn, this caused me to think about all the things that I would like the file utility to be able to tell me about an archive. In particular, I'd like to be able to know what's inside without having to unpack it. The end result of that train of thought was elffile. Much of the discussion in this article is adapted from the PSARC case I filed for elffile in December 2010: PSARC 2010/432 elffile.

     Why file Is No Good For Archives And Yet Should Not Be Fixed

     The standard /usr/bin/file utility is not very useful when applied to archives. When identifying an archive, a user typically wants to know two things:

     1. Is this an archive?
     2. Presupposing that the archive contains objects, which is by far the most common use for archives, what platform are the objects for? Are they for sparc or x86? 32 or 64-bit? Some confusing combination from varying platforms?

     The file utility provides a quick answer to question (1), as it identifies all archives as "current ar archive". It does nothing to answer the more interesting question (2). To answer that question requires a multi-step process:

     1. Extract all archive members.
     2. Use the file utility on the extracted files, examine the output for each file in turn, and compare the results to generate a suitable summary description.
     3. Remove the extracted files.

     It should be easier and more efficient to answer such an obvious question. It would be reasonable to extend the file utility to examine archive contents in place and produce a description. However, there are several reasons why I decided not to do so:

     - The correct design for this feature within the file utility would have file examine each archive member in turn, applying its full abilities to each member. This would be elegant, but also represents a rather dramatic redesign and re-implementation of file. Archives nearly always contain nothing but ELF objects for a single platform, so such generality in the file utility would be of little practical benefit.
     - It is best to avoid adding new options to standard utilities for which other implementations of interest exist. In the case of the file utility, one concern is that we might add an option which later appears in the GNU version of file with a different and incompatible meaning. Indeed, there have been discussions about replacing the Solaris file with the GNU version in the past. This may or may not be desirable, and may or may not ever happen. Either way, I don't want to preclude it.
     - Examining archive members is an O(n) operation, and can be relatively slow with large archives. The file utility is supposed to be a very fast operation.

     I decided that extending file in this way is overkill, and that an investment in the file utility for better archive support would not be worth the cost. A solution that is more narrowly focused on ELF and other linker related files is really all that we need. The necessary code for doing this already exists within libelf. All that is missing is a small user-level wrapper to make that functionality available at the command line.

     In that vein, I considered adding an option for this to the elfdump utility. I examined elfdump carefully, and even wrote a prototype implementation. The added code is small and simple, but the conceptual fit with the rest of elfdump is poor. The result complicates elfdump syntax and documentation, definite signs that this functionality does not belong there. And so, I added this functionality as a new user level command.

     The elffile Command

     The syntax for this new command is:

       elffile [-s basic | detail | summary] filename...

     Please see the elffile(1) manpage for additional details. To demonstrate how output from elffile looks, I will use the following files:

       File         Description
       config       A runtime linker configuration file produced with crle
       dwarf.o      An ELF object
       /etc/passwd  A text file
       mixed.a      Archive containing a mixture of ELF and non-ELF members
       mixed_elf.a  Archive containing ELF objects for different machines
       not_elf.a    Archive containing no ELF objects
       same_elf.a   Archive containing a collection of ELF objects for the same machine. This is the most common type of archive.

     The file utility identifies these files as follows:

       % file config dwarf.o /etc/passwd mixed.a mixed_elf.a not_elf.a same_elf.a
       config:      Runtime Linking Configuration 64-bit MSB SPARCV9
       dwarf.o:     ELF 64-bit LSB relocatable AMD64 Version 1
       /etc/passwd: ascii text
       mixed.a:     current ar archive, 32-bit symbol table
       mixed_elf.a: current ar archive, 32-bit symbol table
       not_elf.a:   current ar archive
       same_elf.a:  current ar archive, 32-bit symbol table

     By default, elffile uses its "summary" output style. This output differs from the output from the file utility in two significant ways:

     - Files that are not an ELF object, archive, or runtime linker configuration file are identified as "non-ELF", whereas the file utility attempts further identification for such files.
     - When applied to an archive, the elffile output includes a description of the archive's contents, without requiring member extraction or other additional steps.

     Applying elffile to the above files:

       % elffile config dwarf.o /etc/passwd mixed.a mixed_elf.a not_elf.a same_elf.a
       config:      Runtime Linking Configuration 64-bit MSB SPARCV9
       dwarf.o:     ELF 64-bit LSB relocatable AMD64 Version 1
       /etc/passwd: non-ELF
       mixed.a:     current ar archive, 32-bit symbol table, mixed ELF and non-ELF content
       mixed_elf.a: current ar archive, 32-bit symbol table, mixed ELF content
       not_elf.a:   current ar archive, non-ELF content
       same_elf.a:  current ar archive, 32-bit symbol table, ELF 64-bit LSB relocatable AMD64 Version 1

     The output for same_elf.a is of particular interest: The vast majority of archives contain only ELF objects for a single platform, and in this case, the default output from elffile answers both of the questions about archives posed at the beginning of this discussion, in a single efficient step. This makes elffile considerably more useful than file, within the realm of linker-related files.

     elffile can produce output in two other styles, "basic" and "detail". The basic style produces output that is the same as that from 'file', for linker-related files. The detail style produces per-member identification of archive contents. This can be useful when the archive contents are not homogeneous ELF objects, and more information is desired than the summary output provides:

       % elffile -s detail mixed.a
       mixed.a: current ar archive, 32-bit symbol table
       mixed.a(dwarf.o): ELF 32-bit LSB relocatable 80386 Version 1
       mixed.a(main.c): non-ELF content
       mixed.a(main.o): ELF 64-bit LSB relocatable AMD64 Version 1 [SSE]
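
     As a rough illustration of the kind of question elffile answers (and emphatically not of how it is implemented; elffile builds on libelf), a short script can walk the members of an ar archive and report which ones begin with the ELF magic number. SysV-style long member names are skipped for simplicity:

       # Rough sketch: list ar archive members and whether each starts with the ELF magic.
       # This only illustrates the idea behind elffile's summary; it is not its implementation.
       import sys

       def scan_archive(path):
           with open(path, "rb") as f:
               if f.read(8) != b"!<arch>\n":
                   print(path + ": not an ar archive")
                   return
               while True:
                   header = f.read(60)
                   if len(header) < 60:
                       break
                   name = header[0:16].decode().rstrip(" /")
                   size = int(header[48:58].decode().strip())
                   data = f.read(size)
                   if size % 2:                  # member data is padded to an even length
                       f.read(1)
                   if not name or name.startswith("/"):
                       continue                  # symbol table, string table, or long-name entry
                   kind = "ELF" if data[:4] == b"\x7fELF" else "non-ELF"
                   print(f"{path}({name}): {kind}")

       scan_archive(sys.argv[1] if len(sys.argv) > 1 else "same_elf.a")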

    Read the article

  • Data Generator Source Adapter

     This component needs little explanation. It generates random integer (DT_I4) and string (DT_WSTR) data and places them in the pipeline. You specify how many columns of each you would like and for any string columns you pass a fixed length value. You then need to specify how many rows in total you require to be generated. This component is used by us to do testing of the pipeline and components downstream. Previously we would have used a script component (as a source) to generate the rows but found ourselves rewriting the code too often so created this component.

     Screenshots: SQL Server 2005 Integration Services; SQL Server 2008/2012 Integration Services.

     The component is provided as an MSI file, however to complete the installation, you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Data Generator Source from the list.

     Downloads

     The Data Generator Source Adapter is available for SQL Server 2005, SQL Server 2008 (includes R2) and SQL Server 2012. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed.

     - Data Generator Source Adapter for SQL Server 2005
     - Data Generator Source Adapter for SQL Server 2008
     - Data Generator Source Adapter for SQL Server 2012

     Version History

     SQL Server 2012
     - Version 3.0.0.30 - SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012)

     SQL Server 2008
     - Version 2.0.0.29 - SQL Server 2008 February 2008 CTP. Includes support for upgrade of 2005 packages. Simplified user interface. (4 Mar 2008)
     - Version 2.0.0.27 - SQL Server 2008 November 2007 CTP. String columns will now use the default system code page. Previously string columns always used 1252. (15 Feb 2008)

     SQL Server 2005
     - Version 1.1.0.23 - SQL Server 2005 RTM Refresh. SP1 Compatibility Testing. (12 Jun 2006)
     - Version 1.0.0.0 - SQL Server 2005 IDW 16 Sept CTP. Public release. (6 Oct 2005)
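
     The component itself needs no code from the user, but the idea it replaces (a throwaway script source that emits random rows for pipeline testing) is easy to sketch. For illustration only, here it is in Python rather than as an SSIS script component:

       # Sketch of the idea: emit N rows of random integers and fixed-length random strings,
       # the way a throwaway test source might, before a reusable component existed.
       import random
       import string

       def generate_rows(row_count, int_columns=2, str_columns=2, str_length=10):
           for _ in range(row_count):
               ints = [random.randint(-2**31, 2**31 - 1) for _ in range(int_columns)]  # DT_I4 range
               strs = ["".join(random.choices(string.ascii_letters, k=str_length))
                       for _ in range(str_columns)]
               yield ints + strs

       for row in generate_rows(5):
           print(row)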

    Read the article

  • ForgeRock Picks Up Sun's Open Source Identity

     Datamation: "Among the promises of open source software is that there is no vendor lock-in. It's a promise that new open source startup ForgeRock is aiming to deliver upon by supporting and extending the OpenSSO open source single sign-on and identity management platform formerly supported by Sun Microsystems."

    Read the article

  • Difficulties of transforming a Java program to another language (or vice versa)

    - by NomeN
     Is there anything simple that Java can't do that can be done in a similar language, or vice versa? Let's say you have a piece of software in language X and you rewrite it entirely in Java (or the other way around); what are the little things that would seriously hamper the translation? At first I was thinking of comprehensions or multiple-exit loops, but these are easily rewritten with a for-each loop with an if statement and a local variable, respectively. Maybe exceptions? But which language does not have a similar construct? Polymorphism? But I don't see how I could show that in a few lines. I'm looking for a short and sweet example that would give some serious headache to work around.

    Read the article

  • Software licensing template that gives room for restricting usage to certain industries/uses of software/source

    - by BSara
     *Why this question is not a duplicate of the questions specified as such: I did not ask if there was a license that restricted specific uses, and I did not ask if I could rewrite every line of any open source project. I asked very specifically: "Does there exist X? If not, can I Y with Z?". As far as I can tell, the two questions that were specified as duplicates do not answer my specific question. Please remove the duplicate status placed on the question. I'm developing some software that I would like to be "semi" open source. I would like to allow anyone to use my software/source unless they are using the software/source for certain purposes. For example, I don't want to allow usage of the software/source if it is being used to create, distribute, view or otherwise support pornography, illegal purposes, etc. I'm no lawyer and couldn't ever hope to write a license myself, nor do I have the time to figure out how best to do this. My question is this: Does there exist a freely available license, or a template for a license, that I can use to license my software under the conditions explained above, just like one can use the Creative Commons licenses? If not, am I allowed to just alter one of the Creative Commons licenses to meet my needs?

    Read the article

  • Source Browsing in FireFox

    - by lavanyadeepak
     Just casually observed this a few minutes back with my Mozilla Firefox 3.6.3. When you do a View Source of any page in Internet Explorer, it just renders as editable, inoperative HTML. However, in Firefox the hyperlinks are shown clickable and active. When you click on any hyperlink, the most obvious and expected output would be that the target page would appear in a new tab in the parent browser. Instead, the View Source window refreshes with the HTML source of the new page. I believe this gesture of Firefox would help us take a journey back into Lynx text browsing, in a way.

    Read the article

  • VB.NET 2008, Windows 7 and saving files

    - by James Brauman
     Hello, we have to learn VB.NET for the semester; my experience lies mainly with C# - not that this should make a difference to this particular problem. I've used just about the most simple way to save a file using the .NET framework, but Windows 7 won't let me save the file anywhere (or anywhere that I have found yet). Here is the code I am using to save a text file:

       Dim dialog As FolderBrowserDialog = New FolderBrowserDialog()
       Dim saveLocation As String = dialog.SelectedPath

       ' ... Build up output string ...

       Try
           ' Try to write the file.
           My.Computer.FileSystem.WriteAllText(saveLocation, output, False)
       Catch PermissionEx As UnauthorizedAccessException
           ' We do not have permissions to save in this folder.
           MessageBox.Show("Do not have permissions to save file to the folder specified. Please try saving somewhere different.", "Error", MessageBoxButtons.OK, MessageBoxIcon.Error)
       Catch Ex As Exception
           ' Catch any exceptions that occurred when trying to write the file.
           MessageBox.Show("Writing the file was not successful.", "Error", MessageBoxButtons.OK, MessageBoxIcon.Error)
       End Try

     The problem is that using this code throws an UnauthorizedAccessException no matter where I try to save the file. I've tried running the .exe file as administrator, and the IDE as administrator. Is this just Windows 7 being overprotective? And if so, what can I do to solve this problem? The requirements state that I be able to save a file! Thanks.

    Read the article

  • Tax Deducted At Source (TDS) for India Localizations

    - by LuciaC
     Do you have questions about TDS (Tax deducted at source) for India Localizations or want to know the latest information about this functionality? See Doc ID 1546099.1 TDS Tax Deduction at Source for India - Master Troubleshooting Guide. The document includes sections with the following information:

     - Documentation and Setup of Tax Deduction at Source – this section contains a presentation with the configuration steps for the TDS feature
     - Resolving errors – this section contains recommended patches and documents with solutions for specific errors
     - Frequently asked questions – See also our new FAQ Doc ID 1549522.1 for frequently asked questions about TDS.

    Read the article

  • compressed archive with quick access to individual file

    - by eric.frederich
     I need to come up with a file format for a new application I am writing. This file will need to hold a bunch of other files which are mostly text but can be other formats as well. Naturally, a compressed tar file seems to fit the bill. The problem is that I want to be able to retrieve some data from the file very quickly, and getting just a particular file from a tar.gz file seems to take longer than it should. I am assuming that this is because it has to decompress the entire file even though I just want one. When I have just a regular uncompressed tar file I can get that data real quick. Let's say the file I need quickly is called data.dat. For example, the command...

       tar -x data.dat -zf myfile.tar.gz

     ...is what takes a lot longer than I'd like. MP3 files have ID3 data and JPEG files have EXIF data that can be read in quickly without opening the entire file. I would like my data.dat file to be available in a similar way. I was thinking that I could leave it uncompressed and separate from the rest of the files in myfile.tar.gz. I could then create a tar file of data.dat and myfile.tar.gz, and then hopefully that data would be able to be retrieved faster because it is at the head of the outer tar file and is uncompressed. Does this sound right?... putting a compressed tar inside of a tar file? Basically, my need is to have an archive type of file with quick access to one particular file. Tar does this just fine, but I'd also like to have that data compressed, and as soon as I do that, I no longer have quick access. Are there other archive formats that will give me that quick access I need? As a side note, this application will be written in Python. If the solution calls for a re-invention of the wheel with my own binary format, I am familiar with C and would have no problem writing the Python module in C. Ideally I'd just use tar, dd, cat, gzip, etc. though. Thanks, ~Eric
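
     For what it's worth, the layout described above (an uncompressed outer tar whose first member is data.dat, followed by the compressed archive) is straightforward to build and read with Python's tarfile module. A minimal sketch, using the file names from the question and a placeholder name for the outer container:

       # Build an uncompressed outer tar with data.dat first, then the compressed payload,
       # so data.dat can be pulled out quickly without decompressing everything.
       import tarfile

       # Writing: data.dat goes in first so a reader hits it immediately.
       with tarfile.open("container.tar", "w") as outer:
           outer.add("data.dat")
           outer.add("myfile.tar.gz")

       # Reading: only data.dat is extracted; the gzipped member is never decompressed.
       with tarfile.open("container.tar", "r") as outer:
           member = outer.extractfile("data.dat")
           quick_data = member.read()
       print(len(quick_data), "bytes read from data.dat")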

    Read the article

  • SQL Source Control Contest

    - by Ajarn Mark Caldwell
     If you’re a regular reader of this blog, you know that I have written several posts about how important I think it is to protect your source code, to version it, and in particular, all the aspects I like about Red Gate’s SQL Source Control product.  But for a moment, let’s take a break from my writing and I want to hear your stories.  What nightmare situation are you in, or can you imagine, where source control for your database would save the world?  Or maybe your life is not so dramatic, but you do see a challenge that, if you just had a good tool like SQL Source Control, would go much smoother.  What’s your pain?  You have read my writings, now tell me your story, and be in the running for a free copy of SQL Source Control from Red Gate. Yes, that’s right.  Although I am just a fan of Red Gate, they have authorized me to give out a handful of licenses to blog readers who are willing to share their story by posting a comment to this blog entry.  Simply add your comment below (be sure to include a valid email address in the box that asks for that) to be entered.  The contest starts immediately and over the next few days, the best stories will win.

    Read the article

  • Webinar: Whatever your source control system - seamlessly link it to SQL Server

     In this webinar consisting of 30 minutes of software demonstrations followed by Q&A, you will learn how to link your database to your existing source control system within SQL Server Management Studio using Red Gate’s SQL Source Control. We will also give you an exclusive preview of forthcoming custom scripts features in the next version of SQL Source Control and SQL Compare.

    Read the article
