Search Results

Search found 13430 results on 538 pages for 'easy'.

Page 231 of 538

  • Windows AD, bulk user creation, homedrv creation via commandline

    - by Neil
    I am bulk-creating AD users from the command line (dsadd) and, whilst doing so, am setting the homedir and homedrv to a DFS location. I observe that when I create the user with all these settings via the GUI (dsa.msc), the home folder gets created on the DFS share with all the permissions set correctly. But when using dsadd, the folder is not created. How can I replicate this GUI behaviour via the command line when creating the user? I don't really want to rely on logon scripts to set it up. Do I have to use mkdir and cacls and something else to give the user ownership? Or maybe I am missing something easy. Any help much appreciated!
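
    Not part of the original question, but as a rough sketch of the mkdir/cacls route the poster mentions, here is one way to pre-create a home folder and hand ownership to the user with icacls. The DFS path, domain prefix and folder layout below are assumptions to adapt, not a tested recipe:

        import subprocess
        from pathlib import Path

        def provision_home(username, share_root=r"\\corp\dfs\home"):
            # Hypothetical DFS namespace; replace with your own
            home = Path(share_root) / username
            home.mkdir(parents=True, exist_ok=True)          # the "mkdir" step from the question
            account = f"CORP\\{username}"                    # assumed domain prefix
            # Full control on the folder and everything below it, then ownership
            subprocess.run(["icacls", str(home), "/grant", f"{account}:(OI)(CI)F"], check=True)
            subprocess.run(["icacls", str(home), "/setowner", account], check=True)

        provision_home("jsmith")

    Run it after dsadd has created the account; dsa.msc appears to do the equivalent of these steps for you when you set the home folder in the GUI.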

    Read the article

  • When does encapsulation become too much?

    - by Samuel
    Recently, I read a lot of good articles about how to do good encapsulation. And when I say "good encapsulation", I don't mean hiding private fields behind public properties; I mean preventing users of your API from doing wrong things. Here are two good articles on the subject: http://blog.ploeh.dk/2011/05/24/PokayokeDesignFromSmellToFragrance.aspx http://lostechies.com/derickbailey/2011/03/28/encapsulation-youre-doing-it-wrong/ At my job, the majority of our applications are not intended for other programmers but rather for customers. About 80% of the application code is at the top of the structure (not used by other code). For this reason, there is probably no chance that this code will ever be used by another application. An example of encapsulation that prevents users from doing the wrong thing with your API is to return an IEnumerable instead of an IList when you don't want to give the user the ability to add or remove items in the list. My question is: when does encapsulation become too much object-oriented purism, keeping in mind that each hour of programming is charged to the customer? I want to write good code that is maintainable, easy to read and easy to use, but when it is not a public API (used by other programmers), where should we draw the line between perfect code and not-so-perfect code? Thank you.
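
    As a rough, language-neutral illustration of the IEnumerable-versus-IList point (sketched in Python rather than C#, with made-up class names): returning a read-only snapshot keeps callers from mutating state they shouldn't own.

        class Order:
            """Illustrative only: exposes line items without letting callers edit them."""
            def __init__(self):
                self._items = []              # internal, mutable list

            def add_item(self, item):
                self._items.append(item)      # the one sanctioned way to modify

            @property
            def items(self):
                # Hand back an immutable snapshot (the IEnumerable-style contract)
                # instead of the underlying list (the IList-style contract).
                return tuple(self._items)

        order = Order()
        order.add_item("widget")
        print(order.items)                    # ('widget',)
        # order.items.append("gadget") would raise: tuples have no append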

    Read the article

  • Did you forget me?

    - by Ratman21
    I know it has been a long time since I last blogged. Still at it, looking for work in the “IT” field. Had another phone interview (only found out during the interview that it was for a one-year contract job, but I still would take it) for a Help Desk job. Didn’t get it; they thought I was not an application support person and more of a hardware support one. Gee... I started out in “IT” as a programmer. Then a programmer/computer operator, then a Tandem/LAN operator and finally a Network operator. I had to deal with so many different operating systems, software and applications. And they thought I was too hardware. Well, I am working a temp day job with the U.S. Census. It gets me out of the house and out in the country. I find getting paid to check for living quarters is not a bad job, except for the many houses I find that are up for sale where it looks like it was not the owners’ (former owners, it seems) idea, with the kids’ toys still in the yards. Not good for someone with an overactive imagination, or for my truck. So far, in the last two weeks, I have backed into a ditch (and had to be pulled out), into a power pole (no damage to the pole and very little to the truck) and into a mailbox (no damage to the truck, but the mailbox was leaning a little). Oh, and I have started reading/using “The Love Dare” book from the movie “Fireproof”. I restarted (yes, I have had to go back to day one from day five) the dare this Sunday. Dare/day one is “Love Is Patient”, and the first dare (reading from the book) is: “The first part of this dare is fairly simple. Although love is communicated in a number of ways, our words often reflect the condition of our heart. For the next day, resolve to demonstrate patience and to say nothing negative to your spouse at all. If the temptation arises, choose not to say anything. It’s better to hold your tongue than to say something you’ll regret.” This was almost too easy, as I can hold back from saying anything bad to anyone, but this can also be a problem in life (you hold back for so long and... boom). Check back for dare/day two, “Love Is Kind”.

    Read the article

  • How to install two packages that write the same file

    - by Joel E Salas
    Sirs, a conundrum. I have two packages that each create /usr/bin/ffprobe. One of them is ffmpeg from the Deb Multimedia repository, while the other is ffmbc 0.7-rc5 built from source. The hand-rolled one is business-critical, and we used to just install it from source wherever it was necessary. I can only assume it would clobber the ffmpeg file, and there were never any ill effects. In theory, it should be acceptable for our ffmbc package to overwrite the file from the ffmpeg package. Is there any easy way to reconcile this?

    Read the article

  • Is meta description still relevant?

    - by Jeff Atwood
    I received this bit of advice about the meta description tag recently: Meta descriptions are used by Google probably 80% of the time for the snippet. They don’t help with rankings but you should probably use them. You could just auto generate them from the first part of the question. The description tag exists in the header, like so: <meta name="Description" content="A brief summary of the content on the page."> I'm not sure why we would need this field, as Google seems perfectly capable of showing the relevant search terms in context in the search result pages, like so (I searched for c# list performance): In other words, where would a meta description summary improve these results? We want the page to show context around the actual search hits, not a random summary we inserted! Google Webmaster Central has this advice: For some sites, like news media sources, generating an accurate and unique description for each page is easy: since each article is hand-written, it takes minimal effort to also add a one-sentence description. For larger database-driven sites, like product aggregators, hand-written descriptions are more difficult. In the latter case, though, programmatic generation of the descriptions can be appropriate and is encouraged -- just make sure that your descriptions are not "spammy." Good descriptions are human-readable and diverse, as we talked about in the first point above. The page-specific data we mentioned in the second point is a good candidate for programmatic generation. I'm struggling to think of any scenario when I would want the Google-generated summary, that is, actual context from the page for the search terms, to be replaced by a hard-coded meta description summary of the question itself.

    Read the article

  • Tips on combining the right Art Assets with a 2D Skeleton and making it flexible

    - by DevilWithin
    I am on my first attempt to build a skeletal animation system for 2D side-scrollers, so I don't really have experience of what may appear in later stages. So I ask advice from anyone that has been there and done that! My approach: I built a tree structure; the root node is like the center of mass of the skeleton, allowing me to apply global transformations to the skeleton. Then I make the hierarchy of the bones, so that when moving a leg, the foot also moves. (I also make a Joint, which connects two bones, for utility.) I load animations into it from a simple keyframe lerp, so it does smooth movement. I map the animation hierarchy to the skeleton, or a part of it, to see if the structure is alike; otherwise the animation doesn't start. I think this is a pretty standard implementation for such a thing, even if I want to convert it to a ragdoll on the fly. Now to my question: imagine a game like Prototype, where there is a skeleton animation of the main character that can animate all meshes in the game that are rigged the same way. That way the character can transform into anything without any extra code. That is pretty much what I want to do for a side-scroller; in theory it sounds easy, but I want to know if it will work well, i.e. whether different characters will be decently animated when using the same skeleton-animation pair. I can see this working well with a stickman, but what about actual humans? Will the perspective look right, or will I need to dynamically change the sprites attached to bones? What do you recommend for such a system?
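
    Not from the original post, but a minimal Python sketch of the structure described above (a bone tree whose local transforms compose down the hierarchy, plus keyframe lerp); the rotation-only transform and the names are simplifications:

        import math

        class Bone:
            def __init__(self, name, length, parent=None):
                self.name, self.length, self.parent = name, length, parent
                self.angle = 0.0                  # local rotation in radians
                self.children = []
                if parent:
                    parent.children.append(self)

            def world_angle(self):
                # Compose transforms down the tree: rotating the leg also moves the foot
                return self.angle + (self.parent.world_angle() if self.parent else 0.0)

            def tip(self, origin=(0.0, 0.0)):
                base = self.parent.tip(origin) if self.parent else origin
                a = self.world_angle()
                return (base[0] + math.cos(a) * self.length,
                        base[1] + math.sin(a) * self.length)

        def lerp(a, b, t):
            return a + (b - a) * t

        # Two keyframes for the thigh, smoothly interpolated (the keyframe lerp step)
        root  = Bone("pelvis", 0.0)
        thigh = Bone("thigh", 10.0, root)
        shin  = Bone("shin", 8.0, thigh)
        thigh.angle = lerp(-0.2, 0.6, 0.5)        # halfway between two key poses
        print(shin.tip())                         # the foot follows automatically

    The same bone tree can be pointed at any sprite set rigged with the same bone names, which is the kind of reuse the question is after.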

    Read the article

  • Defense Manpower Data Center Wins Award for Excellence in the Workforce

    - by Peggy Chen
    The Defense Manpower Data Center milConnect website recently won the 2012 Excellence.gov Award for Excellence in the Workforce. Defense Manpower Data Center milConnect is a centralized, online resource that provides military service members (active and retired) and their families (over 42 million in total) quick access to their profile, health care enrollments, benefits, and other military-related topics. An easy-to-use, safe and secure website, milConnect also provides service members with convenient access to their personnel and service-related information. The self-service website allows users to quickly and easily find and, where applicable, update their information in the Defense Enrollment Eligibility Reporting System (DEERS), and milConnect transmits information to and from one reliable source safely and securely. The Defense Manpower Data Center (DMDC) maintains the largest, most comprehensive central repository of personnel, manpower, casualty, pay, entitlement, personnel security, person identity and attributes, survey, testing, training, and financial data in the Department of Defense (DoD). This is one of the largest systems of record in the world. milConnect had the challenge of modernizing the user experience for over 42 million users. With records in over 22 applications and 25 interfaces in hundreds of existing systems, milConnect needed to reduce the complexity of multiple authentication sources as well as consolidate access to existing systems with sensitive information. It accomplished this using a service-oriented architecture, enterprise security, and access and identity management for self-service access on a massive scale. By providing 24x7x365 secure access and handling over 5 million transactions daily, not only has milConnect, built on Oracle WebCenter, streamlined and improved the customer experience for military personnel and families, it has also done so while cutting costs, allowing self-service access, and promoting electronic government. Congrats to Defense Manpower Data Center and milConnect!

    Read the article

  • How can I force the display of image "handles" in Microsoft Word 2010?

    - by Matt
    In order to select images in Microsoft Word documents you need to get the cursor just right so that it turns into the "+" arrow icon, at which point you can click to select the image. When your cursor is not in exactly the right spot you see something like this (note that the letter "m" shown in the picture is an image, not a font): When your cursor is in an appropriate spot you see something like this: For simple images with relatively straight and simple borders, it's easy; you hover over the image and you get the "+" arrow. But for smaller, more intricate images with many sides, thin borders or perhaps transparency it's often madness as you move your cursor all over the image struggling to find the teenie little spot that Word deems is selectable. Is there some means of enabling the display of "handles" (maybe wrong term) around images before you select them, so you can see the selectable spots without hunting and pecking for them?

    Read the article

  • Extract High Quality Icons from Files Using a Free Tool

    - by Lori Kaufman
    If you need to extract an icon from a program file or other type of file (such as .dll files), there are many free tools available that make the task easy. However, very few will extract high quality icon images from the files. Most free icon extraction tools will extract smaller icon image sizes, such as 16×16, 32×32, or 48×48 pixels. Some icons come in larger sizes, such as icons used in Windows. There is a free, small utility, called BeCyIconGrabber, that allows you to view and save icons and cursors of any size contained in .exe, .dll, .icl, .ocx, .cpl, .scr, .ico, and .cur files. You can save the extracted icons individually as a .png, .bmp, .ico, or .cur file, or in groups within resource libraries, i.e., .dll or .icl files. BeCyIconGrabber can be downloaded as an installable file or as a portable executable that does not need to be installed. We downloaded the portable file.

    Read the article

  • Attempting to migrate to Aptana from Dreamweaver - how to?

    - by Kerry
    I have been using Dreamweaver for years, and have used Dreamweaver MX, Dreamweaver MX 2004, Dreamweaver 8 and Dreamweaver CS4. They all carry a common theme and were relatively easy to migrate between. I use, however, very few features of Dreamweaver. Specifically: syntax highlighting, Telesense, FTP management/site management, and the search/replace. The idea I get from Aptana is that it's much more geared toward the web development/programmer side of things, has better visualizations of classes, etc., and handles everything else just as well. The problem is it is not at all clear to me. I managed to start a new project, but it didn't associate it with an FTP. So, I created an FTP, and it looks like I can have many FTPs for one project. My question then is, is there a tutorial or guide to migrating to Aptana from a Dreamweaver user's perspective?

    Read the article

  • Large invoice database structure and rendering

    - by user132624
    Our client has a MS SQL database that has 1 million customer invoice records in it. Using the database, our client wants its customers to be able to log into a frontend web site and then be able to view, modify and download their company’s invoices. Given the size of the database and the large number of customers who may log into the web site at any time, we are concerned about database engine performance and web page invoice rendering performance. The 1 million invoice database is for just 90 days of sales, so we will remove invoices over 90 days old from the database. Most of the invoices have multiple line items. We can easily convert our invoices into various data formats, so for example it is easy for us to convert to and from SQL to XML with related schema and XSLT. Any data conversion would be done on another server so as not to burden the web interface server. We have tentatively decided to run the web site on a .NET Framework IIS web server using MS SQL on MS Azure. How would you suggest we structure our database for best performance? For example, should we put all the invoices of all customers located within the same 5-digit or 6-digit zip codes into the same table? Or could we set up a separate home directory for each customer on IIS and place each customer’s invoices in each customer’s home directory in XML format? And secondly, what would you suggest would be the best method to render customer invoices on a web page and allow customers to modify them, for best performance? The ADO.NET XML DataSet looks intriguing to us as a method, but we have never used it.
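
    Not part of the question, but for scale: the usual starting point is a single normalized invoice/invoice-line pair of tables indexed on the customer key, rather than splitting tables by zip code or writing per-customer XML files. A toy sketch of that shape (Python with the built-in sqlite3 module standing in for SQL Server, table and column names assumed):

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE invoice (
            invoice_id   INTEGER PRIMARY KEY,
            customer_id  INTEGER NOT NULL,
            invoice_date TEXT    NOT NULL
        );
        CREATE TABLE invoice_line (
            invoice_id INTEGER NOT NULL REFERENCES invoice(invoice_id),
            line_no    INTEGER NOT NULL,
            amount     REAL    NOT NULL,
            PRIMARY KEY (invoice_id, line_no)
        );
        -- One composite index serves the "show me my invoices" page for every customer
        CREATE INDEX ix_invoice_customer_date ON invoice (customer_id, invoice_date);
        """)
        db.execute("INSERT INTO invoice VALUES (1, 42, '2013-01-15')")
        db.execute("INSERT INTO invoice_line VALUES (1, 1, 99.50)")
        rows = db.execute(
            "SELECT invoice_id, invoice_date FROM invoice "
            "WHERE customer_id = ? ORDER BY invoice_date DESC", (42,)).fetchall()
        print(rows)

    With an index like that, one million rows over 90 days is a small lookup per customer, and rendering can page through line items rather than loading whole documents.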

    Read the article

  • Final Man vs. Machine Round of Jeopardy Unfolds; Watson Dominates

    - by ETC
    The final round of IBM’s Watson against Ken Jennings and Brad Rutter ended last night with Watson coming out in a strong lead against its two human opponents. Read on to catch a video of the match and see just how quick Watson is on the draw. Watson tore through many of the answers, the little probability bar at the bottom of the screen denoting it was often 95%+ confident in its answers. Some of the more interesting stumbles were, like in the last matches, based on nuance. By far the biggest “What?” moment of the night, however, was when it answered the Daily Double question of “The New Yorker’s 1959 review of this said in its brevity and clarity, it is ‘unlike most such manuals, a book as well as a tool’”. Watson, inexplicably, answered “Dorothy Parker”. You can’t win them all, eh? Check out the video below to see Watson in action on its final day.

    Read the article

  • Grub options are not visible on booting on Samsung ATIV Book 9 Lite running Ubuntu 14.04

    - by mjwittering
    I've managed to install Ubuntu 14.04 on my new Samsung ATIV Book 9 Lite ultrabook. After updating some configurations in the UEFI, installation was very easy. The only issue I believe I'm still experiencing is when booting. When the laptop should be displaying the GRUB boot options, I instead see a black screen with a purple border of 10px around the screen. I'd like to know how I can update my system so that I see the GRUB boot manager. I've run these commands:

        sudo cat /etc/default/grub

        # If you change this file, run 'update-grub' afterwards to update
        # /boot/grub/grub.cfg.
        # For full documentation of the options in this file, see:
        #   info -f grub -n 'Simple configuration'
        GRUB_DEFAULT=0
        GRUB_HIDDEN_TIMEOUT=0
        GRUB_HIDDEN_TIMEOUT_QUIET=true
        GRUB_TIMEOUT=10
        GRUB_DISTRIBUTOR=`lsb_release -i -s 2> /dev/null || echo Debian`
        GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
        GRUB_CMDLINE_LINUX=""
        # Uncomment to enable BadRAM filtering, modify to suit your needs
        # This works with Linux (no patch required) and with any kernel that obtains
        # the memory map information from GRUB (GNU Mach, kernel of FreeBSD ...)
        #GRUB_BADRAM="0x01234567,0xfefefefe,0x89abcdef,0xefefefef"
        # Uncomment to disable graphical terminal (grub-pc only)
        #GRUB_TERMINAL=console
        # The resolution used on graphical terminal
        # note that you can use only modes which your graphic card supports via VBE
        # you can see them in real GRUB with the command `vbeinfo'
        #GRUB_GFXMODE=640x480
        # Uncomment if you don't want GRUB to pass "root=UUID=xxx" parameter to Linux
        #GRUB_DISABLE_LINUX_UUID=true
        # Uncomment to disable generation of recovery mode menu entries
        #GRUB_DISABLE_RECOVERY="true"
        # Uncomment to get a beep at grub start
        #GRUB_INIT_TUNE="480 440 1"

    Running sudo efibootmgr was not possible.

    Read the article

  • I Could Sure Use Some Windows Azure Code Samples…

    - by BuckWoody
    There are multiple ways to learn, and one of the most effective is with examples. You have multiple options with Windows Azure, including the Software Development Kit, the Windows Azure Training Kit, and now another one: the Microsoft All-In-One Code Framework, a free, centralized code sample library driven by developers' real-world pains and needs. The goal is to provide customer-driven code samples for all Microsoft development technologies, and Windows Azure is included. Once you hit the site, you download an EXE that will create a web-app based installer for a Code Browser. Once inside, you can configure where the samples store data and other settings, and then search for what you want. You can also request a sample – if enough people ask, we do it. OneCode has also partnered with the Gallery and Visual Studio teams to develop the Sample Browser Visual Studio extension. It’s an easy way for developers to find and download samples from within Visual Studio.

    Read the article

  • Centos 6.5 -- backported upgrades/php.ini directives included in php 5.3.3

    - by Decave
    PHP 5.3.3 is the latest version of PHP available with the official CentOS 6.5 repos. As most of you know, calling it version '5.3.3' is slightly deceptive, because critical bug fixes are actually backported into version 5.3.3, so in effect 'version 5.3.3' does get upgraded now and then. My question is: aside from manually toggling directives in php.ini, how can you tell which new directives, that were implemented in, and officially supported by, later versions of PHP, are also available in CentOS 6.5's backported PHP 5.3.3? For example, max_input_vars (http://php.net/manual/en/info.configuration.php#ini.max-input-vars) has been available since PHP 5.3.9. Is there an easy way to tell whether CentOS included this in a backported upgrade to 5.3.3? Thanks!
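
    Not from the question, but one quick check that sidesteps changelog reading: ask the installed PHP binary directly, since ini_get() returns false for directives the interpreter does not know at all. A small wrapper around that idea, sketched in Python and assuming php is on the PATH:

        import subprocess

        def has_directive(name):
            # var_dump(ini_get(...)) prints bool(false) only when the directive
            # doesn't exist in this build; a known directive prints its value.
            out = subprocess.run(
                ["php", "-r", f"var_dump(ini_get('{name}'));"],
                capture_output=True, text=True).stdout
            return "bool(false)" not in out

        print(has_directive("max_input_vars"))   # False if the backport doesn't carry it
        print(has_directive("memory_limit"))     # True on any PHP

    The authoritative answer is still the package changelog (rpm -q --changelog php), but the probe above settles individual directives quickly.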

    Read the article

  • How to Transpose Rows and Columns in Excel 2013

    - by Lori Kaufman
    You’ve set up your worksheet with all your row and column headings and you’ve entered all your data. Then, you discover that it would look better if the rows were the columns and the columns were the rows. How do you accomplish this easily? There is an easy way to convert your rows to columns and vice versa using the Transpose feature in Excel. We’ll show you how. Select the cells containing the headings and data you want to transpose. Click Copy or press Ctrl + C. Click in a blank cell on the spreadsheet. This cell will be the top, left corner of the new table of data. Click the down arrow on the Paste button and select Paste Special from the drop-down menu. On the Paste Special dialog box, select the Transpose check box so there is a check mark in the box and click OK. The rows become the columns and the columns become the rows. The original set of data still exists. You can select those cells and delete the headings and data, if desired. Isn’t that a lot easier and faster than retyping all your data?
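
    Outside the scope of the article, but if the same data ever needs transposing in code rather than in the Excel UI, the operation is a one-liner; a small illustrative Python example:

        rows = [
            ["",        "Q1", "Q2", "Q3"],
            ["Widgets",  10,   12,    9],
            ["Gadgets",   4,    7,   11],
        ]
        # zip(*rows) pairs up the nth value of every row, i.e. rows and columns swap
        columns = [list(col) for col in zip(*rows)]
        for col in columns:
            print(col)
        # ['', 'Widgets', 'Gadgets']
        # ['Q1', 10, 4]  ... and so on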

    Read the article

  • Announcement: DTrace for Oracle Linux General Availability

    - by Zeynep Koch
    Today we are announcing the general availability of DTrace for Oracle Linux. It is available to download from ULN for Oracle Linux Support customers. DTrace is a comprehensive dynamic tracing framework that was initially developed for the Oracle Solaris operating system, and is now available to Oracle Linux customers. DTrace is designed to give operational insights that allow users to tune and troubleshoot the operating system. DTrace provides Oracle Linux developers with a tool to analyze performance and increase observability into the systems they own to see how they work. DTrace enables higher quality application development, reduced downtime, lower cost, and greater utilization of existing resources. Key benefits and features of DTrace on Oracle Linux include:
    • Designed to work on finding performance bottlenecks
    • Dynamically enables the kernel with a number of probe points, improving the ability to service software
    • Enables maximum resource utilization and application performance
    • Fast and easy to use, even on complex systems with multiple layers of software
    If you already have Oracle Linux support, you can download DTrace from the ULN channel. We have a dedicated Forum for DTrace on Oracle Linux, to discuss your experience and questions.

    Read the article

  • Rate limiting an internet connection per user

    - by Alister
    I've got a friend who has a "rent-by-room" property and includes internet access as part of this. However some tenants are somewhat hogging the internet (i.e. constantly downloading). I was wondering if anyone knows of a fairly easy way of rate limiting each connection to make the system more equitable. A preferred solution would be a cheap piece of hardware or some sort of Linux "appliance". I would rather not have to get an iptables headache if this is avoidable.
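
    Not an answer from the thread, but for a sense of what the "easy" appliances and tc-based scripts do under the hood: most of them implement a per-user token bucket. A toy Python sketch of that idea (the rates and tenant names are invented for illustration):

        import time

        class TokenBucket:
            """Allows roughly `rate` bytes per second, with bursts up to `capacity` bytes."""
            def __init__(self, rate, capacity):
                self.rate, self.capacity = rate, capacity
                self.tokens, self.stamp = capacity, time.monotonic()

            def allow(self, nbytes):
                now = time.monotonic()
                # Refill for the time elapsed since the last check, capped at capacity
                self.tokens = min(self.capacity, self.tokens + (now - self.stamp) * self.rate)
                self.stamp = now
                if self.tokens >= nbytes:
                    self.tokens -= nbytes
                    return True           # forward the traffic
                return False              # over budget: queue, delay or drop

        # One bucket per tenant, so a constant downloader only drains their own share
        buckets = {"room_a": TokenBucket(rate=512_000, capacity=2_000_000),
                   "room_b": TokenBucket(rate=512_000, capacity=2_000_000)}
        print(buckets["room_a"].allow(100_000))   # True until room_a uses up its burst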

    Read the article

  • Structuring database for multi-object "activity" and "following" functionalities

    - by romaninsh
    I am working on a web application which operates with different types of objects such as users, profiles, pages, etc. All objects have a unique object_id. When objects interact it may produce "activity", such as a user posting on a page or profile. Activity may be related to multiple objects through their object_id. Users may also follow "objects", and they need to be able to see a stream of relevant activity. Could you provide me with some data structure suggestions which would be efficient and scalable? My goal is to show activity limited to the objects which the user is following. I am not limited to relational databases. Update: As I'm getting advice on ORMs and how to index things, I'd like to stress my question again. According to my current design model the database structure looks like this: As you can see, it's quite easy to implement a database like that. The Activity and Follower tables do contain a much larger number of records than the upper level, but it's tolerable. But when it comes to creating a "timeline" table, it becomes a nightmare. For every user I need to reference all the object activities which he follows. In terms of records it easily gets out of control. Please suggest how to change this structure to avoid timeline creation and still be able to quickly retrieve activity for any given user. Thanks.
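
    Not part of the question, but the structure it is asking for usually goes by "fan-out on read": drop the materialized timeline table entirely and build each user's feed at query time by joining their follow list against recent activity. A rough sketch (Python with sqlite3 for brevity; table and column names are assumptions):

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE follower (user_id INTEGER, object_id INTEGER,
                               PRIMARY KEY (user_id, object_id));
        CREATE TABLE activity (activity_id INTEGER PRIMARY KEY,
                               object_id INTEGER, created TEXT, body TEXT);
        CREATE INDEX ix_activity_object ON activity (object_id, created);
        """)
        db.executemany("INSERT INTO follower VALUES (?, ?)", [(1, 100), (1, 200)])
        db.executemany("INSERT INTO activity (object_id, created, body) VALUES (?, ?, ?)",
                       [(100, "2012-06-01", "page post"),
                        (200, "2012-06-02", "profile update"),
                        (300, "2012-06-03", "object the user does not follow")])

        def timeline(user_id, limit=20):
            # Join the follow list against activity instead of copying rows
            # into a per-user timeline table.
            return db.execute("""
                SELECT a.created, a.body
                FROM activity a
                JOIN follower f ON f.object_id = a.object_id
                WHERE f.user_id = ?
                ORDER BY a.created DESC LIMIT ?""", (user_id, limit)).fetchall()

        print(timeline(1))   # only activity from objects user 1 follows

    When reads dwarf writes the trade-off flips and feeds are precomputed per follower ("fan-out on write"), which is essentially the timeline table the question is trying to avoid, kept bounded per user.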

    Read the article

  • Need for gksudo for "Install new software" in eclipse

    - by Captain Giraffe
    I have been developing for Android on Eclipse for a while now, and my experience with the Eclipse environment on Ubuntu 10.10 has not been a smooth one. With the repo install of Eclipse I have had to sudo eclipse to install the required components for Android development (a big red flag for me). I tried today to install updates for the Eclipse and Android platform, and my Eclipse installation seems to have broken horribly. I can no longer find any of the URLs for new software if I gksudo it; if I run it in user mode it fails (as it always has) with permissions problems. I have chowned user:user all my Eclipse and Android related private/user files. This is a system running Ubuntu 10.10 with GNOME 2.x. On my Kubuntu 11.10 install it works a lot better. Is there an easy fix to this? Is the repo version of Eclipse broken? Should I do a clean install for just my user? (If so, can I retain my previously installed software? The installation process is very time consuming.) I saw there was a previous post here recommending this for new installations.

    Read the article

  • Search Work Items for TFS 2010 - New Extension

    - by MikeParks
    A few months ago I was constantly using Visual Studio 2008 with Team Foundation Server 2008. Searching for work items with queries inside Visual Studio became a pain until I found an add-in that simplified it into one little search box in the IDE. It allowed me to enter some text, hit the enter key, and it would bring back a list (aka open a .wiq file) of work items that matched the text entered. I became a huge fan of Noah Coad's Search Work Item add-in. He wrote a pretty good blog on how to use it as well. Of course when we upgraded to Visual Studio 2010 and Team Foundation Server 2010, the 2008 add-in no longer worked. I didn't see any updates for it on CodePlex to be 2010 compatible. Cory Cissell and I have published a few Visual Studio extensions already, so I figured I'd take a shot at making this tool 2010 compatible by turning it into an extension. Sure enough, it worked. We used it locally for a while and recently decided to publish it to the Visual Studio Gallery. If you are currently looking for an easy way to search work items in Visual Studio 2010, this is worth checking out. Big thanks goes out to Noah for originally creating this on CodePlex. The extension we created can be downloaded here: http://visualstudiogallery.msdn.microsoft.com/en-us/3f31bfff-5ecb-4e05-8356-04815851b8e7
    * Additional note: The default search fields are Title, History, and Description. If you want to modify which work item fields are searchable, type in "--template" (no quotes) into the search box and hit enter. This will open the search template. Just add another "Or" statement, pick the field name, select an operator, type "[search]" (no quotes) in the value field, and hit Ctrl + S to save. The next time you run a search it will use the modified search template. That's all for now. Thanks! - Mike

    Read the article

  • First JSRs Proposed for Java EE 7

    - by Jacob Lehrbaum
    With the approval of Java SE 7 and Java SE 8 JSRs last month, attention is now shifting towards the Java EE platform. While functionality pegged for Java EE 7 was previewed at least as early as Devoxx, the filing of these JSRs marks the first, officially proposed, specifications for the next generation of the popular application server standard. Let's take a quick look at the proposed new functionality.
    Java Persistence API 2.1: The first of the new proposed specifications is JSR 338: Java Persistence API (JPA) 2.1. JPA is designed for use with both Java EE and Java SE and: "deals with the way relational data is mapped to Java objects ("persistent entities"), the way that these objects are stored in a relational database so that they can be accessed at a later time, and the continued existence of an entity's state even after the application that uses it ends. In addition to simplifying the entity persistence model, the Java Persistence API standardizes object-relational mapping." (more about JPA)
    JAX-RS 2.0: The second of the new Java specifications that have been proposed is JSR 339, otherwise known as JAX-RS 2.0. JAX-RS provides an API that enables the easy creation of web services using the Representational State Transfer (REST) architecture. Key features proposed in the new JSR include a Client API, improved support for URIs, a Model-View-Controller architecture and much more!
    More information: Official proposal for Java Persistence 2.1 (jcp.org) | Official proposal for JAX-RS 2.0 (jcp.org) | Kicking off Java EE 7 with 2 JSRs: JAX-RS 2.0 / JPA 2.1 (the Aquarium)

    Read the article

  • I thought everyone did it like this – Training Session Code Management

    - by Fatherjack
    One of an occasional series of blogs about things that I do that perhaps others don’t. From very early on in my dealings with SQL Server Management Studio I started using Solutions and Projects. This means that I started using them when writing sessions, and it wasn’t until speaking with someone at PASS Summit 2013 that I found out that this was a process that was unheard of by some people. So, here we go: a run-through of how I create and manage code and other documents that I use in presentations. For people unsure what solutions and projects are:
    • Solution – a container for one or more projects.
    • Project – a container for files; .sql files are grouped as Queries, all other files are stored as Misc.
    How do I start? Open Management Studio as normal, and then click File | New and select Project. This will bring up the New Project dialog box and you can select/add details as necessary in the places indicated. If this is the first project you are creating then be sure to select the Create directory for solution check box (4). If you know in advance that you are going to have more than one project in the solution then you may want to edit the Solution name (3), as by default it will take the name of the project that you enter at (2). This will lead you to the following folder structure (depending on the location that you chose in 3 above). In SSMS you need to turn on the Solution Explorer, either via the View menu or by pressing Ctrl + Alt + L. This will bring up a dockable window that will let you quickly access the files that you choose to include in the Solution.
    Can we get to work and write some code yet please? Yes, we can. As with many Microsoft products there are several ways to go about this; let’s look at the easiest way when creating new code. When writing a presentation I usually start from the position we are currently in – a brand new solution and project with no code. Later on we will look at incorporating existing code files into the Project where we need it. Right-click on the Project name and choose Add New Query. As soon as you click this you will be prompted to select the SQL Server that you want to connect to, and once you have done that you will have your new query open in the text editor, and the Solution Explorer will now show your server connection and your new query. Now once you have written your code don’t press Save; choose Save As and give the code a better name than QueryX.sql. SSMS will interpret this as a request to rename Query1, and your Project and the Project folder will show that SQLQuery1.sql no longer exists but there is now a file named as you requested. If you happen to click Save in error then right-click the query in the project and choose Rename. You can then alter the name as you like, even when open in the SSMS text editor, and the file will be renamed. When creating a set of scripts for a presentation I name files with a numeric prefix so that when they are sorted by name they are in the order that I need to use them during the session.
    I love this idea but I’ve got loads of existing scripts I want to put in Projects. Excellent, adding existing files to a project is easy. Let’s consider that you have query files in your My Documents folder and you want to bring them into the Project we have just created. Right-click on the Project and choose Add | Existing Item. Navigate to the location of your chosen file and select it. The file will open in the SSMS text editor and the Project will be updated to show that the selected query is now part of your project. If you look in Windows Explorer you will see that the query file has been copied into the Project folder; the original file still remains in your My Documents (or wherever it existed). I’ll leave it as an exercise for the reader to explore creating further Projects within a solution, but will happily answer questions if you get into difficulties.
    What other advantages do I get from this? Well, as all your code is neatly in one Solution folder and the folder contains only files that are pertinent to the session you are presenting, it makes it very easy to share this code: simply copy the whole folder onto a USB stick, blog, FTP location, wherever you choose, and it’s all there in one self-contained parcel. You don’t have to limit yourself to .sql query files; you can add any sort of document via the Add Existing Item method, just try it out. Right-click on the project and choose Add | Existing Item, then change the file type filter. You can multi-select items here using Ctrl as you click each item you want. When you are done, click the Add button and the items will be brought into your project. Again, using this process means the files are copied into the project folder, leaving your original files untouched in their original location. Once they are here you can double-click them in the SSMS Solution Explorer to open them; for files with a specific file type the appropriate application will be launched – i.e. Word, Excel etc. However, if the files are something that the SSMS text editor can display then they will open in a tab in SSMS. Try it out with a text file or even a PS1 file.
    This sounds excellent but what do I need to watch out for? One big thing to consider when working like this is the version of SSMS that you are using. There is something fundamentally different between the different versions in the way that the project (.ssmssqlproj) and solution (.sqlsuo and .ssmssln) files are formatted. If you create a solution in an older version of SSMS and then open it in a newer version you will be given the option to upgrade it. Once you do this upgrade then the older version of SSMS will not be able to open the solution any more. Now this ranks as more of an annoyance than a disaster, as the files within the projects are not affected in any way; you would just have to delete the files mentioned and recreate the solution in the older version again.
    Summary: So, here we have seen how using SSMS Projects and Solutions can help keep related code files (and other document types) together in a neat structure so that they can be quickly navigated during a presentation, and it also makes it incredibly simple to distribute your code and share it with others. I hope this is of use to you and helps you bring more order into your SQL files; whether you are a person that does technical presentations or not, having your code grouped and managed can make for a lot of advantages as your code library expands.

    Read the article

  • Remote File Copy - Win Server 2008

    - by Scott
    I'd like to copy backup archives from a remote server to my client machine. In the past, I've installed an FTP server on the remote machine and directed local server backups to dump into that directory. I'd then FTP in from my client machine. Just wondering if there is a simpler way to do this using Win 7 (client) and Win Server 2008? Robocopy? RDC command line options? For example, I can easily remote desktop in and drag the files from the server to my local machine. If there is an easy command line way to do this, then I don't have to set up an FTP server, which is ideal. Thanks.
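
    Not from the question: if the server's backup folder is reachable as a share or admin share over the LAN/VPN (an assumption), plain file copy APIs already work against UNC paths, so even a tiny script avoids the FTP server; robocopy would do the same job with retries built in. A hypothetical Python sketch with made-up paths:

        import shutil
        from pathlib import Path

        # Hypothetical paths: the server's backup folder exposed via the e$ admin share
        src = Path(r"\\backupserver\e$\SQLBackups")
        dst = Path(r"C:\LocalBackups")
        dst.mkdir(exist_ok=True)

        for archive in src.glob("*.bak"):
            target = dst / archive.name
            if not target.exists():              # simple incremental behaviour
                shutil.copy2(archive, target)    # copy2 keeps timestamps
                print("copied", archive.name)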

    Read the article

  • Plugging GlusterFS and Openfiler together

    - by lpfavreau
    Has anyone had experience plugging GlusterFS and Openfiler together, or something similar? Here is the motivation:
    • Disk space on multiple servers aggregated using GlusterFS
    • Centralized access using LDAP/AD and quota management, using Openfiler as the GlusterFS client
    • SMB/CIFS server for easy sharing to multiple users on Mac and Windows
    I know I can have Gluster installed on Openfiler (rPath Linux) successfully, but Openfiler seems to be very picky about what it can use as a shared drive. Mounting the Gluster volume inside an existing share does not seem to allow quotas based on the mounted folder's free space. If this is not possible, is there any alternative that gives the same capabilities?

    Read the article
