Search Results

Search found 35976 results on 1440 pages for 'js test driver'.


  • WinXP display driver incompatible when booting the Ubuntu 12.04 ISO

    - by user285829
    I have Windows XP on an HP Pavilion with an AMD Athlon 64 3300+ processor. I have downloaded the Ubuntu 12.04 desktop ISO. When trying to run Ubuntu, I get a line that says "SiS 630 bus not detected." It continues to a blue screen, then says low-graphics mode only and shows a white screen. The HP has a SiS 760 display adapter, driver 6.14.10.3671, dated 4/20/2005. I'm not familiar with Ubuntu because I've never gotten it started yet. Can the Ubuntu install process be changed to accept my SiS 760 driver? I'm trying to make a live CD (DVD) to run Ubuntu.

    Read the article

  • Installing Windows 7 x64 in VirtualBox - device driver missing error

    - by chrisjlee
    After creating a VHD and going through that process, I'm unable to install Windows 7 x64. I've gone through various settings trying to get it right. It boots the Windows installer but then shows the following error (image included too): "A required CD/DVD drive device driver is missing. If you have a driver floppy disk, CD, DVD, or USB flash drive, please insert it now. Note: if the Windows installation media is in the CD/DVD drive, you can safely remove it for this step." I've also set up my storage tree like so: Would anyone know what the issue could be? It's asking for CD/DVD device drivers and I'm unable to install.

    Read the article

  • Broadcom STA wireless driver freezes

    - by Srki
    Upon updating my 10.10 netbook to 11.04, it froze on the first startup. I checked the forums and found that disabling WLAN in the BIOS fixes the problem, so I did that and it worked; the system was fine and didn't freeze, but then I had no wireless. I know the Broadcom STA driver is to blame, because I removed it from "Additional Drivers", re-enabled WLAN in the BIOS, and everything worked fine (except no wireless, because no driver was activated). After enabling Broadcom STA in Additional Drivers again and restarting, wireless came up, found my network and connected to it, but within a matter of seconds the netbook froze. Can you please help me?

    Read the article

  • How to fix additional STA driver installation error on 12.04

    - by nibl
    I have a Broadcom BCM4313 wireless card. It worked fine under 11.04 and the upgrade to 11.10 also went smoothly, but the 12.04 upgrade has broken the wireless connection. I've seen several posts about this, and all presume you can activate the STA drivers, but that's where it fails on this machine. The log is in /var/log/jockey.log: WARNING: modinfo for module wl failed: ERROR: modinfo: could not find module wl; WARNING: /sys/module/wl/drivers does not exist, cannot rebind wl driver; DEBUG: BroadcomWLHandler enabled(): kmod disabled, bcm43xx: blacklisted, b43: blacklisted, b43legacy: blacklisted (the last error repeats several times). Any ideas how to proceed?

    Read the article

  • Cannot make NVIDIA driver work with Ubuntu 12.10

    - by user1293231
    I seem to have a problem similar to many others, but I haven't managed to get it solved. I have a Lenovo N581 with an NVIDIA GeForce 610M. I have just installed a fresh Ubuntu 12.10 64-bit, plus KDE, and am trying to get my NVIDIA card working. I have tried all the workarounds posted: purge nvidia, install the kernel source/headers and then reinstall nvidia-current-updates (or just nvidia-current), run "sudo nvidia-xconfig". It does create an xorg.conf but not much else (no Module section, by the way). The result is that my system (Jockey) tells me that the driver is there but not in use, and I only get a 640x480 resolution. If I try to launch nvidia-settings, it does indeed tell me that the NVIDIA driver is not in use. I do all this under KDE, but I guess that doesn't matter at this stage. Any hint on how to resolve this? I feel stuck and cannot use any of the acceleration, which is partly why I got this laptop in the first place. Thanks for any help/advice you may provide!

    Read the article

  • TX 1000 Nvidia driver problem

    - by Marduk
    Hello, I installed the new Ubuntu the other day, installed all the drivers and then restarted the laptop. I get to the login screen with no problem, but after I log in I get a black screen, and after about 10 minutes a window pops up saying that the application Compiz has closed unexpectedly. I relaunch the app, and then it happens again. I have uninstalled the NVIDIA driver and restarted, and the OS works fine. I tried an older driver and still have the same problem. If anyone out there can help, that would be great.

    Read the article

  • How to uninstall a special driver/kernel in 13.04

    - by hako
    I have tried to install a special ATI driver using a method described in http://debianhelp.wordpress.com/2012/09/28/to-do-list-after-installing-ubuntu-13-04-aka-raring-ringtail-operating-system/ Details: Alternative ATI legacy video driver PPA installation (for < 5000 series cards):
        sudo apt-get remove --purge fglrx fglrx_* fglrx-amdcccle* fglrx-dev*
        sudo add-apt-repository ppa:makson96/fglrx
        sudo apt-get update
        sudo apt-get upgrade
        sudo apt-get install linux-headers-generic fglrx-legacy
        sudo aticonfig --initial
    And then reboot. There were a couple of warnings from update-alternatives at "sudo apt-get install linux-headers-generic fglrx-legacy". But at the last step I got "aticonfig: No supported adapters detected". Unity does not start anymore; I can only log in to GNOME. How can I get back to a working system?

    Read the article

  • Virtualbox Kernel driver not installed

    - by Cyndi
    I tried to use my VirtualBox and this is the error I received. (I just updated to the new Ubuntu also, and I am VERY new to this Linux OS :) ) Kernel driver not installed (rc=-1908) The VirtualBox Linux kernel driver (vboxdrv) is either not loaded or there is a permission problem with /dev/vboxdrv. Please reinstall the kernel module by executing '/etc/init.d/vboxdrv setup' as root. Users of Ubuntu, Fedora or Mandriva should install the DKMS package first. This package keeps track of Linux kernel changes and recompiles the vboxdrv kernel module if necessary.

    Read the article

  • RHEL 5.x SCSI Driver .img update for initrd.img

    - by zmische
    I have a production DB server (RHEL 5.1) with the LSI MegaRAID driver loaded during setup via a DUD (Driver Update Diskette). Now I'd like to update the kernel and other packages to version 5.4. I have also downloaded a new LSI SCSI driver (megasr-13.11.0922.2009-1-rhel50-u4-all.img). Could you explain the necessary steps to make this driver visible at boot for the new kernel (let's assume that I've already updated the kernel to 5.4)? I read the Red Hat article "How do I add a driver to the initrd.img". Does it contain all the steps I need? Thanks in advance!

    Read the article

  • nVidia Driver - Laptop is forced as one of my monitors

    - by vaccano
    I am trying to get my NVIDIA driver to correctly configure my multi-monitor setup. I have my laptop in a docking station with two monitors hooked up to it. I had an old driver this worked correctly with. However, that driver was causing a lot of "Deferred Procedure Calls", so I upgraded to a newer driver. But now I am forced to use my laptop monitor as one of my monitors. Here is the image in the NVIDIA Control Panel: As you can see, both monitors are recognized, but the only options available are to use one of them with the laptop display. Any ideas? I am running Windows XP (latest updates) and have an NVIDIA Quadro 1500M. I have tried several different driver versions, and all the new ones have this issue.

    Read the article

  • Default Webcam Driver Issues

    - by Omegaclawe
    I'm having trouble getting my monitor-attached webcam (ASUS VK248H) to install on my new computer. On the old computer, it was a matter of not using a USB 3.0 port, but I can't get anything to work on the new one. I have tried all manner of uninstalling/reinstalling the driver and restarting the computer, as well as literally every USB port on the computer (14 in total). It's not that Windows isn't recognizing the device; it most certainly is. However, comparing it to the old computer's driver details, the new computer is not using the ksthunk.sys driver in addition to usbvideo.sys, as the old (working) computer does. Naturally, I figured the way ahead was to simply get this other driver to work with the hardware, but I haven't found a way to do that. Does anyone know how I can force it to use ksthunk.sys? It seems rather difficult to get it to install anything when Windows feels that everything is peachy.

    Read the article

  • Intel HD 4000 driver not working

    - by Sagar Parakh
    I have a Dell Inspiron 15R SE 7520. I upgraded my system to Windows 8.1 a few days back. After the upgrade, my Intel HD 4000 graphics driver stopped working. I downloaded the latest driver from the Dell website, but during installation it said that my graphics driver is not compatible or validated, and my dedicated graphics card, an AMD ATI Radeon HD 7730M, also stopped working. There is also a problem with my screen brightness: I am unable to change it. How can I make my graphics drivers work?

    Read the article

  • Installing ethernet controller driver after clean windows XP install

    - by user1488804
    I've been having trouble getting internet access through an ethernet connection since doing a fresh install of Windows XP. It appears that I'm missing the driver for the ethernet controller. Looking in Device Manager under the network adapters section, there's an apparently working '1394 Net Adaptor', but under unknown devices there's the question-marked ethernet controller, which I believe is what I need the driver for. I found an old CD that contains the driver for it (Realtek 8139 or some 81xx). There's no setup tool, only three files (a .cat file, an .inf file and a .sys file). I've tried right-clicking on the .inf file and using 'Install', and also tried just copying the .sys file into the system32/drivers folder, but neither has got me anywhere after restarting. The ethernet controller is still listed under 'unknown devices'. EDIT: I've also tried updating the driver in Device Manager, pointing it at the folder containing the driver, but this hasn't worked either. Any ideas? Thanks

    Read the article

  • Logitech USB keyboard driver not found on Windows 7 x64

    - by AngCaruso
    I have a Logitech Internet 350 keyboard which has been working fine on my Lenovo T400 laptop for well over three years. Just within the past week or so, Windows 7 can no longer find the driver for it. There is no custom driver from Logitech for this device; it uses the generic Windows USB HID driver. The keyboard works just fine from the BIOS (and from Linux, which I dual-boot on this machine), but Windows 7 cannot find or load the driver for it. Any ideas? I smell a Windows Update problem, but I have no idea how to fix it, and I really am not interested in rolling back updates. New info: I just tested a generic Dell USB keyboard and it worked just fine, with Windows immediately recognizing the device and installing the HID keyboard class driver. So it seems that Windows has decided not to recognize my specific Logitech keyboard. I still suspect a Windows Update issue, but I would love to hear other suggestions.

    Read the article

  • Compiling JS-Test-Driver Plugin and Installing it on Eclipse 3.5.1 Galileo?

    - by leeand00
    I downloaded the source of the js-test-driver from: http://js-test-driver.googlecode.com/svn/tags/1.2 It compiles just fine, but one of the unit tests fails: [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 0.012 sec [junit] Test com.google.jstestdriver.eclipse.ui.views.FailureOnlyViewerFilterTest FAILED I am using: - ANT 1.7.1 - javac 1.6.0_12 And I'm trying to install the js-test-driver plugin on Eclipse 3.5.1 Galileo Despite the failed test I installed the plugin into my C:\eclipse\dropins\js-test-driver directory by copying (exporting from svn) the compiled feature and plugins directories there, to see if it would yield any hints to what the problem is. When I started eclipse, added the plugin to the panel using Window-Show View-Other... Other-JsTestDriver The plugin for the panel is added, but it displays the following error instead of the plugin in the panel: Could not create the view: Plugin com.google.jstestdriver.eclipse.ui was unable to load class com.google.jstestdriver.eclipse.ui.views.JsTestDriverView. And then bellow that I get the following stack trace after clicking Details: java.lang.ClassNotFoundException: com.google.jstestdriver.eclipse.ui.views.JsTestDriverView at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:494) at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:410) at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:398) at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:105) at java.lang.ClassLoader.loadClass(Unknown Source) at org.eclipse.osgi.internal.loader.BundleLoader.loadClass(BundleLoader.java:326) at org.eclipse.osgi.framework.internal.core.BundleHost.loadClass(BundleHost.java:231) at org.eclipse.osgi.framework.internal.core.AbstractBundle.loadClass(AbstractBundle.java:1193) at org.eclipse.core.internal.registry.osgi.RegistryStrategyOSGI.createExecutableExtension(RegistryStrategyOSGI.java:160) at org.eclipse.core.internal.registry.ExtensionRegistry.createExecutableExtension(ExtensionRegistry.java:874) at org.eclipse.core.internal.registry.ConfigurationElement.createExecutableExtension(ConfigurationElement.java:243) at org.eclipse.core.internal.registry.ConfigurationElementHandle.createExecutableExtension(ConfigurationElementHandle.java:51) at org.eclipse.ui.internal.WorkbenchPlugin$1.run(WorkbenchPlugin.java:267) at org.eclipse.swt.custom.BusyIndicator.showWhile(BusyIndicator.java:70) at org.eclipse.ui.internal.WorkbenchPlugin.createExtension(WorkbenchPlugin.java:263) at org.eclipse.ui.internal.registry.ViewDescriptor.createView(ViewDescriptor.java:63) at org.eclipse.ui.internal.ViewReference.createPartHelper(ViewReference.java:324) at org.eclipse.ui.internal.ViewReference.createPart(ViewReference.java:226) at org.eclipse.ui.internal.WorkbenchPartReference.getPart(WorkbenchPartReference.java:595) at org.eclipse.ui.internal.Perspective.showView(Perspective.java:2229) at org.eclipse.ui.internal.WorkbenchPage.busyShowView(WorkbenchPage.java:1067) at org.eclipse.ui.internal.WorkbenchPage$20.run(WorkbenchPage.java:3816) at org.eclipse.swt.custom.BusyIndicator.showWhile(BusyIndicator.java:70) at org.eclipse.ui.internal.WorkbenchPage.showView(WorkbenchPage.java:3813) at org.eclipse.ui.internal.WorkbenchPage.showView(WorkbenchPage.java:3789) at org.eclipse.ui.handlers.ShowViewHandler.openView(ShowViewHandler.java:165) at org.eclipse.ui.handlers.ShowViewHandler.openOther(ShowViewHandler.java:109) at 
org.eclipse.ui.handlers.ShowViewHandler.execute(ShowViewHandler.java:77) at org.eclipse.ui.internal.handlers.HandlerProxy.execute(HandlerProxy.java:294) at org.eclipse.core.commands.Command.executeWithChecks(Command.java:476) at org.eclipse.core.commands.ParameterizedCommand.executeWithChecks(ParameterizedCommand.java:508) at org.eclipse.ui.internal.handlers.HandlerService.executeCommand(HandlerService.java:169) at org.eclipse.ui.internal.handlers.SlaveHandlerService.executeCommand(SlaveHandlerService.java:241) at org.eclipse.ui.internal.ShowViewMenu$3.run(ShowViewMenu.java:141) at org.eclipse.jface.action.Action.runWithEvent(Action.java:498) at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:584) at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:501) at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:411) at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:84) at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1003) at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:3880) at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3473) at org.eclipse.ui.internal.Workbench.runEventLoop(Workbench.java:2405) at org.eclipse.ui.internal.Workbench.runUI(Workbench.java:2369) at org.eclipse.ui.internal.Workbench.access$4(Workbench.java:2221) at org.eclipse.ui.internal.Workbench$5.run(Workbench.java:500) at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:332) at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:493) at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:149) at org.eclipse.ui.internal.ide.application.IDEApplication.start(IDEApplication.java:113) at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:194) at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:110) at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:79) at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:368) at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:179) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:559) at org.eclipse.equinox.launcher.Main.basicRun(Main.java:514) at org.eclipse.equinox.launcher.Main.run(Main.java:1311) Additionally, if I go to the settings in Window-Preferences and try to view the JS Test Driver Preferences, I get the following dialog: Problem Occurred Unable to create the selected preference page. com.google.jstestdriver.eclipse.ui.WorkbenchPreferencePage Thank you, Andrew J. Leer
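
    A quick way to tell whether the problem is js-test-driver itself or just the dropins install of the Eclipse plugin is to run the test runner from the command line against a trivial test. The sketch below is only an illustration and is not taken from the question: the file names and the myapp.Greeter class are made up, and it assumes a jsTestDriver.conf that points at a capture server on http://localhost:9876 and loads the source files before the test files.

        // src-test/greeterTest.js - a minimal js-test-driver style test (hypothetical files)
        // Assumes src/greeter.js defines myapp.Greeter with a greet(name) method.
        GreeterTest = TestCase("GreeterTest");

        GreeterTest.prototype.testGreet = function() {
          var greeter = new myapp.Greeter();
          // assertEquals comes from js-test-driver's built-in assertion library
          assertEquals("Hello World!", greeter.greet("World"));
        };

    If the standalone runner (java -jar JsTestDriver.jar with a captured browser and --tests all) passes, the ClassNotFoundException above is most likely down to how the plugin was dropped in to Eclipse rather than to the js-test-driver build itself.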

    Read the article

  • What's the right way to start a node.js service?

    - by elliot42
    I'm running a node.js service (statsd) on CentOS 6. What's the proper way to daemonize and start such a service? Potential daemonizers (are daemonizers supposed to be language-specific or general?): forever (node-specific), daemonize, nohup (presumably wrong), start-stop-daemon (Debian-only? is this for daemonizing or for starting/stopping? what is the CentOS equivalent?). Should the app itself really know how to daemonize itself and then have a -d flag (e.g. via node-daemonize2 or forever-monitor)? Service starters: should these come from the system/distro (service, which is really /etc/init.d on CentOS and Upstart on Ubuntu?), or from monitoring tools such as monit, daemontools or runit? I'm unfortunately new to this; where can I read up on the most standard, classic, reliable way of doing this?
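
    For the forever-monitor option mentioned above, a minimal in-process supervisor might look like the sketch below. This is an illustration only: the stats.js entry point, the config path and the restart limit are assumptions, not anything prescribed by statsd or the question, and you would still need an init script (or Upstart/systemd unit) to launch this wrapper at boot.

        // monitor.js - hedged sketch: supervising statsd with forever-monitor
        var forever = require('forever-monitor');

        var child = new (forever.Monitor)('stats.js', {
          max: 10,              // stop retrying after 10 crashes
          silent: false,        // pass the child's stdout/stderr through
          args: ['config.js']   // statsd normally takes its config file as an argument
        });

        child.on('exit', function () {
          console.log('statsd exited after 10 restarts');
        });

        child.start();

    The trade-off is roughly this: a supervisor such as forever or monit handles restarts on crash, while the distro's service mechanism (/etc/init.d on CentOS 6) handles starting at boot and the usual start/stop/status interface; most setups end up combining the two rather than picking one.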

    Read the article

  • Continuous Integration for SQL Server Part II – Integration Testing

    - by Ben Rees
    My previous post, on setting up Continuous Integration for SQL Server databases using GitHub, Bamboo and Red Gate’s tools, covered the first two parts of a simple Database Continuous Delivery process: Putting your database in to a source control system, and, Running a continuous integration process, each time changes are checked in. However there is, of course, a lot more to to Continuous Delivery than that. Specifically, in addition to the above: Putting some actual integration tests in to the CI process (otherwise, they don’t really do much, do they!?), Deploying the database changes with a managed, automated approach, Monitoring what you’ve just put live, to make sure you haven’t broken anything. This post will detail how to set up a very simple pipeline for implementing the first of these (continuous integration testing). NB: A lot of the setup in this post is built on top of the configuration from before, so it might be difficult to implement this post without running through part I first. There’ll then be a third post on automated database deployment followed by a final post dealing with the last item – monitoring changes on the live system. In the previous post, I used a mixture of Red Gate products and other 3rd party software – GitHub and Atlassian Bamboo specifically. This was partly because I believe most people work in an heterogeneous environment, using software from different vendors to suit their purposes and I wanted to show how this could work for this process. For example, you could easily substitute Atlassian’s BitBucket or Stash for GitHub, depending on your needs, or use an alternative CI server such as TeamCity, TFS or Jenkins. However, in this, post, I’ll be mostly using Red Gate products only (other than tSQLt). I would do this, firstly because I work for Red Gate. However, I also think that in the area of Database Delivery processes, nobody else has the offerings to implement this process fully – so I didn’t have any choice!   Background on Continuous Delivery For me, a great source of information on what makes a proper Continuous Delivery process is the Jez Humble and David Farley classic: Continuous Delivery – Reliable Software Releases through Build, Test, and Deployment Automation This book is not of course, primarily about databases, and the process I outline here and in the previous article is a gross simplification of what Jez and David describe (not least because it’s that much harder for databases!). However, a lot of the principles that they describe can be equally applied to database development and, I would argue, should be. As I say however, what I describe here is a very simple version of what would be required for a full production process. A couple of useful resources on handling some of these complexities can be found in the following two references: Refactoring Databases – Evolutionary Database Design, by Scott J Ambler and Pramod J. Sadalage Versioning Databases – Branching and Merging, by Scott Allen In particular, I don’t deal at all with the issues of multiple branches and merging of those branches, an issue made particularly acute by the use of GitHub. The other point worth making is that, in the words of Martin Fowler: Continuous Delivery is about keeping your application in a state where it is always able to deploy into production.   I.e. we are not talking about continuously delivery updates to the production database every time someone checks in an amendment to a stored procedure. 
That is possible (and what Martin calls Continuous Deployment). However, again, that’s more than I describe in this article. And I doubt I need to remind DBAs or Developers to Proceed with Caution!   Integration Testing Back to something practical. The next stage, building on our set up from the previous article, is to add in some integration tests to the process. As I say, the CI process, though interesting, isn’t enormously useful without some sort of test process running. For this we’ll use the tSQLt framework, an open source framework designed specifically for running SQL Server tests. tSQLt is part of Red Gate’s SQL Test found on http://www.red-gate.com/products/sql-development/sql-test/ or can be downloaded separately from www.tsqlt.org - though I’ll provide a step-by-step guide below for setting this up. Getting tSQLt set up via SQL Test Click on the link http://www.red-gate.com/products/sql-development/sql-test/ and click on the blue Download button to download the Red Gate SQL Test product, if not already installed. Follow the install process for SQL Test to install the SQL Server Management Studio (SSMS) plugin on to your machine, if not already installed. Open SSMS. You should now see SQL Test under the Tools menu:   Clicking this link will give you the basic SQL Test dialogue: As yet, though we’ve installed the SQL Test product we haven’t yet installed the tSQLt test framework on to any particular database. To do this, we need to add our RedGateApp database using this dialogue, by clicking on the + Add Database to SQL Test… link, selecting the RedGateApp database and clicking the Add Database link:   In the next screen, SQL Test describes what will be installed on the database for the tSQLt framework. Also in this dialogue, uncheck the “Add SQL Cop tests” option (shown below). SQL Cop is a great set of pre-defined tests that work within the tSQLt framework to check the general health of your SQL Server database. However, we won’t be using them in this particular simple example: Once you’ve clicked on the OK button, the changes described in the dialogue will be made to your database. Some of these are shown in the left-hand-side below: We’ve now installed the framework. However, we haven’t actually created any tests, so this will be the next step. But, before we proceed, we’ve made an update to our database so should, again check this in to source control, adding comments as required:   Also worth a quick check that your build still runs with the new additions!: (And a quick check of the RedGateAppCI database shows that the changes have been made).   Creating and Testing a Unit Test There are, of course, a lot of very interesting unit tests that you could and should set up for a database. The great thing about the tSQLt framework is that you can write these in SQL. The example I’m going to use here is pretty Mickey Mouse – our database table is going to include some email addresses as reference data and I want to check whether these are all in a correct email format. Nothing clever but it illustrates the process and hopefully shows the method by which more interesting tests could be set up. Adding Reference Data to our Database To start, I want to add some reference data to my database, and have this source controlled (as well as the schema). 
First of all I need to add some data in to my solitary table – this can be done a number of ways, but I’ll do this in SSMS for simplicity: I then add some reference data to my table: Currently this reference data just exists in the database. For proper integration testing, this needs to form part of the source-controlled version of the database – and so needs to be added to the Git repository. This can be done via SQL Source Control, though first a Primary Key needs to be added to the table. Right click the table, select Design, then right-click on the first “id” row. Then click on “Set Primary Key”: NB: once this change is made, click Save to save the change to the table. Then, to source control this reference data, right click on the table (dbo.Email) and selecting the following option:   In the next screen, link the data in the Email table, by selecting it from the list and clicking “save and close”: We should at this point re-commit the changes (both the addition of the Primary Key, and the data) to the Git repo. NB: From here on, I won’t show screenshots for the GitHub side of things – it’s the same each time: whenever a change is made in SQL Source Control and committed to your local folder, you then need to sync this in the GitHub Windows client (as this is where the build server, Bamboo is taking it from). An interesting point to note here, when these changes are committed in SQL Source Control (right-click database and select “Commit Changes to Source Control..”): The display gives a warning about possibly needing a migration script for the “Add Primary Key” step of the changes. This isn’t actually necessary in this case, but this mechanism would allow you to create override scripts to replace the default change scripts created by the SQL Compare engine (which runs underneath SQL Source Control). Ignoring this message (!), we add a comment and commit the changes to Git. I then sync these, run a build (or the build gets run automatically), and check that the data is being deployed over to the target RedGateAppCI database:   Creating and Running the Test As I mention, the test I’m going to use here is a very simple one - are the email addresses in my reference table valid? This isn’t of course, a full test of email validation (I expect the email addresses I’ve chosen here aren’t really the those of the Fab Four) – but just a very basic check of format used. I’ve taken the relevant SQL from this Stack Overflow article. In SSMS select “SQL Test” from the Tools menu, then click on + New Test: In the next screen, give your new test a name, and also enter a name in the Test Class box (test classes are schemas that help you keep things organised). Also check that the database in which the test is going to be created is correct – RedGateApp in this example: Click “Create Test”. After closing a couple of subsequent dialogues, you’ll see a dummy script for the test, that needs filling in:   We now need to define the SQL for our test. As mentioned before, tSQLt allows you to write your unit tests in T-SQL, and the code I’m going to use here is as below. 
This needs to be copied and pasted in to the query window, to replace the default given by tSQLt:

        -- Basic email check test
        ALTER PROCEDURE [MyChecks].[test Check Email Addresses]
        AS
        BEGIN
            SET NOCOUNT ON

            Declare @Output VarChar(max)
            Set @Output = ''

            SELECT @Output = @Output + Email + Char(13) + Char(10)
            FROM dbo.Email
            WHERE email NOT LIKE '%_@__%.__%'

            If @Output > ''
            Begin
                Set @Output = Char(13) + Char(10) + @Output
                EXEC tSQLt.Fail @Output
            End

        END;

Once this script is entered, hit execute to add the stored procedure to the database. Before committing the test to source control, it's worth just checking that it works! For a positive test, click on "SQL Test" from the Tools menu, then click Run Tests. You should see output like the following: - a green tick to indicate success! But of course, what we also need to do is test that this is actually doing something by showing a failed test. Edit one of the email addresses in your table to an incorrect format: Now, re-run the same SQL Test as before and you'll see the following: Great – we now know that our test is really doing something! You'll also see a useful error message at the bottom of SSMS: (leave the email address as invalid for now, for the next steps). The next stage is to check this new test in to source control again, by right-clicking on the database and checking in the changes with a commit message (and not forgetting to sync in the GitHub client): Checking that the Tests are Running as Integration Tests After the changes above are made, and after a build has run on Bamboo (manual or automatic), looking at the stored procedures for the RedGateAppCI database, the SPROC for the new test has been moved over to the database. However, this is not exactly what we were after. We didn't want to just copy objects from one database to another, but actually run the tests as part of the build/integration test process. I.e. we're continuously checking any changes we make (in this case, to the reference data emails), to ensure we're not breaking a test that we've set up. The behaviour we want to see is that, if we check in static data that is incorrect (as we did in step 9 above) and we have the tSQLt test set up, then our build in Bamboo should fail. However, re-running the build shows the following: - sadly, a successful build! To make sure the tSQLt tests are run as part of the integration test, we need to amend a switch in the Red Gate CI config file. First, navigate to the file sqlCI.targets in your working folder. Edit this document, make the following change, save the document, then commit and sync this change in the GitHub client:

        <!-- tSQLt tests -->
        <!-- Optional -->
        <!-- To run tSQLt tests in source control for the database, enter true. -->
        <enableTsqlt>true</enableTsqlt>

Now, if we re-run the build in Bamboo (NB: I've moved to a new server here, hence different address and build number): - superb, a broken build!! The error message isn't great here, so to get more detailed info, click on the full build log link on this page (below the fold). The interesting part of the log shown is towards the bottom. Pulling out this part: 21-Jun-2013 11:35:19 Build FAILED.
21-Jun-2013 11:35:19 21-Jun-2013 11:35:19 "C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj" (default target) (1) -> 21-Jun-2013 11:35:19 (sqlCI target) -> 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: RedGate.Deploy.SqlServerDbPackage.Shared.Exceptions.InvalidSqlException: Test Case Summary: 1 test case(s) executed, 0 succeeded, 1 failed, 0 errored. [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: [MyChecks].[test Check Email Addresses] failed: [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: ringo.starr@beatles [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: +----------------------+ [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj] 21-Jun-2013 11:35:19 EXEC : sqlCI error occurred: |Test Execution Summary| [C:\Users\Administrator\bamboo-home\xml-data\build-dir\RGA-RGP-JOB1\sqlCI.proj]   As a final check, we should make sure that, if we now fix this error, the build succeeds. So in SSMS, I’m going to correct the invalid email address, then check this change in to SQL Source Control (with a comment), commit to GitHub, and re-run the build:   This should have fixed the build: It worked! Summary This has been a very quick run through the implementation of CI for databases, including tSQLt tests to test whether your database updates are working. The next post in this series will focus on automated deployment – we’ve tested our database changes, how can we now deploy these to target sites?  

    Read the article

  • Building and Deploying Windows Azure Web Sites using Git and GitHub for Windows

    - by shiju
    The Microsoft Windows Azure team has released a new version of Windows Azure which provides many excellent features. The new Windows Azure provides Web Sites, which allows you to deploy up to 10 web sites for free in a multitenant shared environment, and you can easily upgrade a web site to a private, dedicated virtual server when the traffic grows. The Meet Windows Azure Fact Sheet provides the following information about a Windows Azure Web Site: Windows Azure Web Sites enable developers to easily build and deploy websites with support for multiple frameworks and popular open source applications, including ASP.NET, PHP and Node.js. With just a few clicks, developers can take advantage of Windows Azure's global scale without having to worry about operations, servers or infrastructure. It is easy to deploy existing sites, if they run on Internet Information Services (IIS) 7, or to build new sites, with a free offer of 10 websites upon signup, with the ability to scale up as needed with reserved instances. Windows Azure Web Sites includes support for the following: multiple frameworks including ASP.NET, PHP and Node.js; popular open source software apps including WordPress, Joomla!, Drupal, Umbraco and DotNetNuke; Windows Azure SQL Database and MySQL databases; multiple types of developer tools and protocols including Visual Studio, Git, FTP, Visual Studio Team Foundation Services and Microsoft WebMatrix. Sign up to Windows Azure and Enable Web Sites You can sign up for a 90-day free trial account in Windows Azure from here. After creating an account in Windows Azure, go to https://account.windowsazure.com/ and select "preview features" to view the available previews. In the Web Sites section of the preview features, click "try it now", which will enable the web sites feature. Create a Web Site in Windows Azure To create a web site, log in to the Windows Azure portal and select Web Sites, then click the New icon in the left corner. Click WEB SITE, QUICK CREATE and put in values for the URL and REGION dropdown. You can see all the web sites from the dashboard of the Windows Azure portal. Set up Git Publishing Select your web site from the dashboard, and select Set up Git publishing. To enable Git publishing, you must give a user name and password, which will initialize a Git repository. Clone the Git Repository We can use GitHub for Windows to publish apps to non-GitHub repositories, which is well explained by Phil Haack on his blog post. Here we are going to deploy the web site using GitHub for Windows. Let's clone a Git repository using the Git URL we get from the Windows Azure portal: copy the Git URL and execute "git clone" with it. You can use the Git Shell provided by GitHub for Windows. To get it, right-click on GitHub for Windows and select "open shell here" as shown in the picture below. When executing the git clone command, it will ask for a password, where you have to give the password specified in the Windows Azure portal. After cloning the Git repository, you can drag and drop the local Git repository folder onto the GitHub for Windows GUI. This will automatically add the Windows Azure Web Site repository to GitHub for Windows, where you can commit your changes and publish your web sites to Windows Azure. Publish the Web Site using GitHub for Windows We can add files for multiple frameworks, including ASP.NET, PHP and Node.js, to the local repository folder and easily publish them to Windows Azure from the GitHub for Windows GUI.
For this demo, let me just add a simple Node.js file named Server.js which defines a few request handlers:

        var http = require('http');
        var port = process.env.PORT;
        var querystring = require('querystring');
        var utils = require('util');
        var url = require("url");

        var server = http.createServer(function(req, res) {
          switch (req.url) { //checking the request url
            case '/':
              homePageHandler(req, res); //handler for home page
              break;
            case '/register':
              registerFormHandler(req, res); //handler for register
              break;
            default:
              nofoundHandler(req, res); // handler for 404 not found
              break;
          }
        });
        server.listen(port);

        //function to display the html form
        function homePageHandler(req, res) {
          console.log('Request handler home was called.');
          res.writeHead(200, {'Content-Type': 'text/html'});
          var body = '<html>'+
            '<head>'+
            '<meta http-equiv="Content-Type" content="text/html; '+
            'charset=UTF-8" />'+
            '</head>'+
            '<body>'+
            '<form action="/register" method="post">'+
            'Name:<input type=text value="" name="name" size=15></br>'+
            'Email:<input type=text value="" name="email" size=15></br>'+
            '<input type="submit" value="Submit" />'+
            '</form>'+
            '</body>'+
            '</html>';
          //response content
          res.end(body);
        }

        //handler for POST request
        function registerFormHandler(req, res) {
          console.log('Request handler register was called.');
          var pathname = url.parse(req.url).pathname;
          console.log("Request for " + pathname + " received.");
          var postData = "";
          req.on('data', function(chunk) {
            // append the current chunk of data to the postData variable
            postData += chunk.toString();
          });
          req.on('end', function() {
            // doing something with the posted data
            res.writeHead(200, "OK", {'Content-Type': 'text/html'});
            // parse the posted data
            var decodedBody = querystring.parse(postData);
            // output the decoded data to the HTTP response
            res.write('<html><head><title>Post data</title></head><body><pre>');
            res.write(utils.inspect(decodedBody));
            res.write('</pre></body></html>');
            res.end();
          });
        }

        //Error handler for 404 not found
        function nofoundHandler(req, res) {
          console.log('Request handler nofound was called.');
          res.writeHead(404, {'Content-Type': 'text/plain'});
          res.end('404 Error - Request handler not found');
        }

If there is any change in the local repository folder, GitHub for Windows will automatically detect the changes. In the step above, we have just added a Server.js file, so GitHub for Windows will detect the change. Let's commit the changes to the local repository before publishing the web site to Windows Azure. After committing all the changes, you can click the publish button, which will publish all the changes to the Windows Azure repository.
The following screenshot shows the deployment history in the Windows Azure portal. GitHub for Windows provides a sync button, which can be used to synchronize the local repository with the Windows Azure repository after any further commits are made locally. Our web site is running after the deployment using Git. Summary Windows Azure Web Sites lets developers easily build and deploy websites with support for multiple frameworks including ASP.NET, PHP and Node.js, and the sites can easily be deployed using Visual Studio, Git, FTP, Visual Studio Team Foundation Services and Microsoft WebMatrix. In this demo, we have deployed a Node.js web site to Windows Azure using Git. We can use GitHub for Windows to publish apps to non-GitHub repositories, and can use it to publish Web Sites to Windows Azure.

    Read the article

  • Extremely simple online multiplayer game

    - by Postscripter
    I am considering creating a simple multiplayer game which focuses on physics and can accommodate up to 30 players per session. Very simple graphics, but smart physics (pushing, weight and gravity, balance) is required. After some research I found a good JavaScript framework (?) called box2d.js. I found the demo to be excellent; this is the kind of physics I am looking for in my game. Now, what other frameworks will I need? Node.js? Prototype.js? (By the way, I found the latest version of prototype.js to be released in 2010. Is this still supported? Should I avoid using it?) What about HTML5 and Canvas: would I need them? WebSockets? I am a beginner in the web programming and game programming world, but I will learn fast; I am a computer science graduate (with not much web experience, but I know the essentials: JavaScript, HTML, CSS). I just need a guiding path to build my game. Thanks
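
    To make the networking part concrete (independent of box2d.js), a bare-bones authoritative server that relays world state over WebSockets could look like the sketch below. It uses the ws package for Node.js; the 30-player cap, the broadcast rate and the message shapes are illustrative assumptions, not requirements of any framework.

        // server.js - hedged sketch: clients send inputs, the server owns the state
        // (a real game would step box2d.js here instead of the toy position update)
        var WebSocketServer = require('ws').Server;
        var wss = new WebSocketServer({ port: 8080 });

        var MAX_PLAYERS = 30;
        var players = {};   // id -> { x, y }
        var nextId = 1;

        wss.on('connection', function (socket) {
          if (Object.keys(players).length >= MAX_PLAYERS) {
            socket.close();   // session is full
            return;
          }
          var id = nextId++;
          players[id] = { x: 0, y: 0 };

          socket.on('message', function (data) {
            // apply the client's input; real code would validate it first
            var input = JSON.parse(data);
            players[id].x += input.dx || 0;
            players[id].y += input.dy || 0;
          });

          socket.on('close', function () {
            delete players[id];
          });
        });

        // broadcast a snapshot of the world roughly 20 times per second
        setInterval(function () {
          var snapshot = JSON.stringify(players);
          wss.clients.forEach(function (client) {
            if (client.readyState === 1) {   // 1 == OPEN
              client.send(snapshot);
            }
          });
        }, 50);

    Clients would render those snapshots (an HTML5 Canvas is plenty for simple graphics) and only ever send their inputs, which keeps the physics authoritative on the server; that is the usual reason to pair Node.js and WebSockets with a physics library such as box2d.js.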

    Read the article

  • TypeError: Object {...} has no method 'find' - when using mongoose with express

    - by sdouble
    I'm having trouble getting data from MongoDB using mongoose schemas with express. I first tested with just mongoose in a single file (mongoosetest.js) and it works fine. But when I start dividing it all up with express routes and config files, things start to break. I'm sure it's something simple, but I've spent the last 3 hours googling and trying to figure out what I'm doing wrong and can't find anything that matches my process enough to compare. mongoosetest.js (this works fine, but not for my application):

        var mongoose = require('mongoose');
        mongoose.connect('mongodb://localhost/meanstack');
        var db = mongoose.connection;

        var userSchema = mongoose.Schema({
          name: String
        }, {collection: 'users'});

        var User = mongoose.model('User', userSchema);

        User.find(function(err, users) {
          console.log(users);
        });

    These files are where I'm having issues. I'm sure it's something silly, probably a direct result of using external files, exports, and requires. My server.js file just starts up and configures express. I also have a routing file and a db config file. Routing file (allRoutes.js):

        var express = require('express');
        var router = express.Router();
        var db = require('../config/db');
        var User = db.User();

        // routes
        router.get('/user/list', function(req, res) {
          User.find(function(err, users) {
            console.log(users);
          });
        });

        // catch-all route
        router.get('*', function(req, res) {
          res.sendfile('./public/index.html');
        });

        module.exports = router;

    db config file (db.js):

        var mongoose = require('mongoose');
        var dbHost = 'localhost';
        var dbName = 'meanstack';
        var db = mongoose.createConnection(dbHost, dbName);

        var Schema = mongoose.Schema,
            ObjectId = Schema.ObjectId;

        db.once('open', function callback() {
          console.log('connected');
        });

        // schemas
        var User = new Schema({
          name : String
        }, {collection: 'users'});

        // models
        mongoose.model('User', User);
        var User = mongoose.model('User');

        // exports
        module.exports.User = User;

    I receive the following error when I browse to localhost:3000/user/list:

        TypeError: Object { _id: 5398bed35473f98c494168a3 } has no method 'find'
            at Object.module.exports [as handle] (C:\...\routes\allRoutes.js:8:8)
            at next_layer (C:\...\node_modules\express\lib\router\route.js:103:13)
            at Route.dispatch (C:\...\node_modules\express\lib\router\route.js:107:5)
            at C:\...\node_modules\express\lib\router\index.js:213:24
            at Function.proto.process_params (C:\...\node_modules\express\lib\router\index.js:284:12)
            at next (C:\...\node_modules\express\lib\router\index.js:207:19)
            at Function.proto.handle (C:\...\node_modules\express\lib\router\index.js:154:3)
            at Layer.router (C:\...\node_modules\express\lib\router\index.js:24:12)
            at trim_prefix (C:\...\node_modules\express\lib\router\index.js:255:15)
            at C:\...\node_modules\express\lib\router\index.js:216:9

    Like I said, it's probably something silly that I'm messing up with trying to organize my code, since my single file (mongoosetest.js) works as expected. Thanks.
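
    For what it's worth, the error itself points at a likely culprit: in allRoutes.js the model is invoked (db.User()), which yields a new, empty document (hence the lone _id in the message) rather than the model, and documents have no find method. A minimal sketch of the corrected route file is below; it only illustrates that one change, plus a basic error response, and otherwise follows the code from the question.

        // routes/allRoutes.js - sketch of the fix: reference the model, don't call it
        var express = require('express');
        var router = express.Router();
        var db = require('../config/db');

        var User = db.User;   // was db.User(), which returned a document, not the model

        router.get('/user/list', function (req, res) {
          User.find(function (err, users) {
            if (err) return res.status(500).send(err);
            console.log(users);
            res.json(users);
          });
        });

        // catch-all route
        router.get('*', function (req, res) {
          res.sendfile('./public/index.html');
        });

        module.exports = router;

    One more thing worth checking, since db.js opens the database with mongoose.createConnection but registers the schema through mongoose.model: models registered that way live on mongoose's default (never-opened) connection, so registering them on the created connection instead (db.model(...) rather than mongoose.model(...)) may save a round of silently buffered queries later.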

    Read the article

  • How to troubleshoot and tweak an unreasonably slow wireless connection on Ubuntu?

    - by Leonid
    I've just acquired a USB F5D8053ed Belkin adapter and it is unreasonably slow. Details of how I installed the firmware and device driver are described in this AU question. I believe there is a problem with either the driver or the adapter itself that is preventing it from using the full network quality. At the moment I can see that my Windows laptop is performing about 30x better than the Ubuntu desktop PC with the Belkin. What are the ways to troubleshoot pure wireless network performance on Ubuntu?

    Read the article

  • Unity won't load with proprietary drivers

    - by Nobita
    Running Ubuntu 11.04 for the first time and getting used to Unity, I decided to install the proprietary drivers for my NVIDIA graphics card. The output of lspci | grep VGA is:
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: nVidia Corporation Device 0df5 (rev a1)
    If I activate the driver that is "recommended", the next time I try to log in to a Unity session it just changes to the classic session. How can that be happening? I attach a screenshot of my proprietary drivers screen:

    Read the article

  • Why does my Ubuntu always run in low-graphics mode?

    - by sam
    My graphics card is an NVIDIA GTX 460, and I'm on Ubuntu 10.04. I have reinstalled the graphics driver twice, but Ubuntu shows that it is running in low-graphics mode a while after the driver installation process. The following is the information the computer shows: http://imgur.com/GkZUz Also, when I try to change the NVIDIA graphics settings on Ubuntu, the following information appears: http://imgur.com/SVhCc How can I fix it? Thank you!

    Read the article

  • How to use Unity 3d and Gnome Shell without any drivers?

    - by user49523
    Currently I don't have any driver available for the laptop that I use, and because of this I can only use Unity 2D and GNOME Panel. I have Ubuntu 11.10 and Ubuntu 12.04 installed, with no drivers for either. I would like to use Unity 3D and GNOME Shell instead of Unity 2D and GNOME Panel, or at least have that option, even if the computer becomes a little slower. Is there a way of enabling Unity 3D and GNOME Shell without any driver being installed?

    Read the article

  • Is it possible to use Unity 3D and GNOME 3 without any drivers?

    - by user49523
    Currently I don't have any driver available for the laptop that I use, and because of this I can only use Unity 2D and GNOME 2. I have Ubuntu 11.10 and Ubuntu 12.04 installed, with no drivers for either. I would like to use Unity 3D and GNOME 3 instead of Unity 2D and GNOME 2, or at least have that option, even if the computer becomes a little slower. So, is there a way of enabling Unity 3D and GNOME 3 without any driver being installed?

    Read the article
