Search Results

Search found 29502 results on 1181 pages for 'line segment'.


  • ASP.NET MVC localization DisplayNameAttribute alternatives: a good way

    - by Brian Schroer
    The ASP.NET MVC HTML helper methods like .LabelFor and .EditorFor use model metadata to autogenerate labels for model properties. By default the property name is used for the label text, but if that’s not appropriate, you can use a DisplayName attribute to specify the desired label text:

        [DisplayName("Remember me?")]
        public bool RememberMe { get; set; }

    I’m working on a multi-language web site, so the labels need to be localized. I tried pointing the DisplayName attribute to a resource string:

        [DisplayName(MyResource.RememberMe)]
        public bool RememberMe { get; set; }

    …but that results in the compiler error "An attribute argument must be a constant expression, typeof expression or array creation expression of an attribute parameter type". I got around this by creating a custom LocalizedDisplayNameAttribute class that inherits from DisplayNameAttribute:

         1: public class LocalizedDisplayNameAttribute : DisplayNameAttribute
         2: {
         3:     public LocalizedDisplayNameAttribute(string resourceKey)
         4:     {
         5:         ResourceKey = resourceKey;
         6:     }
         7:
         8:     public override string DisplayName
         9:     {
        10:         get
        11:         {
        12:             string displayName = MyResource.ResourceManager.GetString(ResourceKey);
        13:
        14:             return string.IsNullOrEmpty(displayName)
        15:                 ? string.Format("[[{0}]]", ResourceKey)
        16:                 : displayName;
        17:         }
        18:     }
        19:
        20:     private string ResourceKey { get; set; }
        21: }

    Instead of a display string, it takes a resource key as its constructor argument. The DisplayName property is overridden to get the display string from the resource file (line 12). If the key is not found, I return a formatted string containing the key (e.g. “[[RememberMe]]”) so I can tell by looking at my web pages which resource keys I haven’t defined yet (line 15). The usage of my custom attribute in the model looks like this:

        [LocalizedDisplayName("RememberMe")]
        public bool RememberMe { get; set; }

    That was my first attempt at localized display names, and it’s a technique that I still use in some cases, but in my next post I’ll talk about the method that I now prefer: a custom DataAnnotationsModelMetadataProvider class…

    Read the article

  • Visual Studio LightSwitch: Yes, these are the droids you’re looking for

    - by Jim Duffy
    With all the news and focus on the new features coming in Silverlight 5, I thought I’d take a few minutes to remind folks about the work that Microsoft has done on LightSwitch, since the applications created by LightSwitch are Silverlight applications. LightSwitch makes it easier for non-coders to build business applications and easier for coders to maintain them. For those not familiar with LightSwitch, it is a new tool that provides an easier and quicker way for coder and non-coder types alike to create line-of-business applications for the desktop, the web, and the cloud. The target audience for this tool is those power-user types who create Access applications for their organization. While those Access applications fill an immediate need, they typically aren’t very scalable, extendable and/or maintainable by the development staff of the organization. LightSwitch creates applications based on technologies built into Visual Studio, thus making it easier for corporate developers to extend and maintain them. LightSwitch is currently in beta, but it will ultimately become a new addition to the Visual Studio line of products. Go ahead and download the beta to get a better idea of what the product can do for your organization. The LightSwitch Developer Center contains links to download the beta, instructional videos, tutorials, and the LightSwitch Training Kit. Another quality resource for LightSwitch information is the Visual Studio LightSwitch Team Blog. My good friend Beth Massi is on the LightSwitch team and has additional valuable content on her blog. Have a day.

    Read the article

  • Redistribution of sqlpackage.exe [SSDT]

    - by jamiet
    This is a short note for anyone that may be interested in redistributing sqlpackage.exe. If this isn’t you then no need to keep reading. Ostensibly this is here for anyone that bingles for this information. sqlpackage.exe is a command-line tool that ships with SQL Server Development Tools (SSDT) in SQL Server 2012, and its main purpose (amongst other things) is to deploy .dacpac files from the command line. It’s quite conceivable that one might want to install only sqlpackage.exe rather than the full SSDT suite (for example on a production server), and I myself have recently had that need. I enquired to the SSDT product team about the possibility of doing this. I said:

        Back in VS DB Proj days it was possible to use VSDBCMD.exe on a machine that did not have the full VS shell install by shipping lots of pre-requisites along for the ride (details at How to: Prepare a Database for Deployment From a Command Prompt by Using VSDBCMD.EXE). Is there a similar mechanism for using VSDBCMD.exe’s replacement, sqlpackage.exe?

    Here was the reply from Barclay Hill, who heads up the development team:

        Yes, SQLPackage.exe is the analogy of VSDBCMD.exe. You can acquire it separately, in a stand-alone package, by installing DACFX. You can get it from:
        Feature pack: http://www.microsoft.com/en-us/download/details.aspx?id=29065
        Web Platform Installer: http://www.microsoft.com/web/gallery/install.aspx?appid=DACFX
        You will notice it has dependencies on SQLDOM and SQLCLRTYPES. WebPI will install these for you, but it is à la carte on the feature pack.

    So, now you know. I didn’t enquire about licensing of DACFX, but given SSDT is free I am going to assume that the same applies to DACFX too.

    @Jamiet
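    For anyone who lands here looking for usage as well as packaging: once DACFX is installed, deploying a .dacpac from the command line looks something like the sketch below. The file, server and database names are placeholders of mine, not from Jamie’s post:

        SqlPackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /TargetServerName:MYSERVER /TargetDatabaseName:MyDatabase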

    Read the article

  • How do I roll back to the shipped version of Thunderbird?

    - by kallakafar
    I was using Thunderbird v15.0 on Ubuntu 12.04 LTS until now, with the Lightning extension installed to manage my calendar within Thunderbird. Everything was working fine until I decided to update Thunderbird to the latest version, 16.0, from the Ubuntu repository. The installation was successful, and my profile and everything else were carried over perfectly, except that Lightning no longer works - it is disabled because Lightning v1.7 is NOT yet compatible with the latest Thunderbird v16. As a result I have lost all my scheduling. Now I would like to go back to Thunderbird v15 so that I can use Lightning. The Ubuntu repository only offers v16 now. On Mozilla's site they still offer v15 for Linux, so I downloaded the tarball and uncompressed it from the command line. Now I have a folder called thunderbird. There are no readme/configuration files. There are the following executable files inside this folder: crashreporter, mozilla-remote-client, plugin-container, thunderbird and thunderbird-bin. I tried invoking thunderbird and thunderbird-bin from the command line using sudo, but still nothing opens. I have execute permissions on this folder's contents. I'm quite new to Linux. Please let me know why I'm not able to launch Thunderbird. Did I install it incorrectly? Please let me know if I can get a .deb package for Thunderbird v15.
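    A minimal sketch of launching the extracted build, assuming the tarball was unpacked into ~/thunderbird. Note that it should be run as the normal desktop user, not via sudo, so it picks up the right profile and X session:

        cd ~/thunderbird
        ./thunderbird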

    Read the article

  • I can't add a PPA repository from behind a proxy (with @ in the username)

    - by kenorb
    I'm trying to add a PPA repository (as root) with the following commands:

        export HTTP_PROXY="http://[email protected]:[email protected]:8080"
        add-apt-repository ppa:nilarimogard/webupd8

    Unfortunately it doesn't work, failing with this traceback:

        Traceback (most recent call last):
          File "/usr/bin/add-apt-repository", line 125, in <module>
            ppa_info = get_ppa_info_from_lp(user, ppa_name)
          File "/usr/lib/python2.7/dist-packages/softwareproperties/ppa.py", line 84, in get_ppa_info_from_lp
            curl.perform()
        pycurl.error: (56, 'Received HTTP code 407 from proxy after CONNECT')

    It looks like curl is connecting to the proxy, but the proxy says that authentication is required. I've tried .curlrc and the http_proxy environment variable instead, but it doesn't work either.

        strace -e network,write -s1000 add-apt-repository ppa:nilarimogard/webupd8
        socket(PF_INET6, SOCK_DGRAM, IPPROTO_IP) = 4
        socket(PF_INET, SOCK_STREAM, IPPROTO_TCP) = 4
        connect(4, {sa_family=AF_INET, sin_port=htons(8080), sin_addr=inet_addr("165.x.x.232")}, 16) = -1 EINPROGRESS (Operation now in progress)
        getsockopt(4, SOL_SOCKET, SO_ERROR, [0], [4]) = 0
        getpeername(4, {sa_family=AF_INET, sin_port=htons(8080), sin_addr=inet_addr("165.x.x.232")}, [16]) = 0
        getsockname(4, {sa_family=AF_INET, sin_port=htons(46025), sin_addr=inet_addr("161.20.75.220")}, [16]) = 0
        sendto(4, "CONNECT launchpad.net:443 HTTP/1.1\r\nHost: launchpad.net:443\r\nUser-Agent: PycURL/7.22.0\r\nProxy-Connection: Keep-Alive\r\nAccept: application/json\r\n\r\n", 146, MSG_NOSIGNAL, NULL, 0) = 146
        recvfrom(4, "HTTP/1.1 407 Proxy Authentication Required\r\nProxy-Authenticate: BASIC realm=\"proxy\"\r\nCache-Control: no-cache\r\nPragma: no-cache\r\nContent-Type: text/html; charset=utf-8\r\nProxy-Connection: close\r\nSet-Cookie: BCSI-CS-91b9906520151dad=2; Path=/\r\nConnection: close\

    Maybe it's because there is an @ sign in the username? Wget works fine with the proxy. Related: How do I add a repository from behind a proxy?

    Environment: Ubuntu 12.04; curl 7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3; curl features: GSS-Negotiate IDN IPv6 Largefile NTLM NTLM_WB SSL libz TLS-SRP
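    A common workaround when the proxy username itself contains an @ is to percent-encode it as %40 so that curl can tell the credentials apart from the host, and to export the lowercase variables too, since different tools read different ones. A hedged sketch with placeholder credentials (not the poster's real ones):

        export http_proxy="http://user%40domain:password@165.x.x.232:8080"
        export https_proxy="$http_proxy"
        sudo -E add-apt-repository ppa:nilarimogard/webupd8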

    Read the article

  • BIP Debugging to a file

    - by Tim Dexter
    This applies if you use the standalone BIP server, or BIP with OBIEE, with OC4J as the web server. Have you ever taken a looksee at the console window (DOS box/xterm) that you use to start it? Ever turned on debugging, seen masses of info flow by that window, and wanted to capture it all? I have been debugging today and watched all that info fly by, and on Windoze it gets lost before you can see it! The BIP developers use the System.out.println() and System.err.println() methods in the BIP applications to generate debugging information. Normally the output from these method calls goes to the console where the OC4J process is started. However, you can specify command-line options when starting OC4J to direct the stdout and stderr output directly to files. The -out and -err parameters tell OC4J which file to direct the output to. All you need do is modify the oc4j.cmd file used to start BIP. I didn't get fancy and just changed the following line under the start section, from:

        set CMDARGS=-config "%SERVER_XML%" -userThreads

    to:

        set CMDARGS=-config "%SERVER_XML%" -out D:\BI\OracleBI\oc4j_bi\j2ee\home\log\oc4j.out -err D:\BI\OracleBI\oc4j_bi\j2ee\home\log\oc4j.err -userThreads

    Bounced the server and I now have a ballooning pair of debug files that I can pore over to my heart's content. The .out file appears to contain BIP-only log info and the .err file, OBIEE messages. If you are using another web server to host BIP, just check the user docs to find out how to get the log files to write. Note to self: remember to turn off the debug when I'm done!

    Read the article

  • Help with a simple incremental backup script

    - by Evan
    I'd like to run the following incomplete script weekly as a cron job to back up my home directory to an external drive mounted as /mnt/backups:

        #!/bin/bash
        TIMEDATE=$(date +%b-%d-%Y-%k:%M)
        LASTBACKUP=pathToDirWithLastBackup
        rsync -avr --numeric-ids --link-dest=$LASTBACKUP /home/myfiles /mnt/backups/myfiles$TIMEDATE

    My first question is: how do I correctly set LASTBACKUP to the most recently created directory in /mnt/backups? Secondly, I'm under the impression that using --link-dest will mean that files in previous backups will not be copied in later backups if they still exist, but will rather symbolically link back to the originally copied files? However, I don't want to retain old files forever. What would be the best way to remove all the backups before a certain date without losing files that may still be linked in those backups by current backups? Basically I'm looking to merge all the files from before a certain date into the backup at that date, if that makes more sense than the way I initially framed the question :). Can --link-dest create hard links, and if so, would just deleting previous directories not actually remove the linked files? Finally, I'd like to add a line to my script that compresses each newly created backup folder (/mnt/backups/myfiles$TIMEDATE). Based on reading this question, I was wondering if I could just use this line

        gzip --rsyncable /backups/myfiles$TIMEDATE

    after I run rsync, so that sequential rsync --link-dest executions would find already copied and compressed files? I know that's a lot, so many thanks in advance for your help!!
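    A hedged sketch for the first question, assuming all backup folders share the myfiles prefix used above: sort the existing backup directories by modification time and take the newest. (For what it's worth, --link-dest does create hard links, so deleting an old backup directory only removes its names; file data still referenced by newer backups survives.)

        LASTBACKUP=$(ls -td /mnt/backups/myfiles* | head -n 1)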

    Read the article

  • Cannot locate packages for Citrix install

    - by Noel Evans
    I'm trying to get a Citrix Receiver installed on Ubuntu 14.04 (64 bit) following Ubuntu's docs. The first line of the instructions says to get these required packages:

        sudo apt-get install libmotif4:i386 nspluginwrapper lib32z1 libc6-i386 libxp6:i386 libxpm4:i386 libasound2:i386

    But if I paste in that line, I get this error:

        Reading state information... Done
        E: Unable to locate package libmotif4
        E: Unable to locate package libxp6
        E: Unable to locate package libxpm4
        E: Unable to locate package libasound2

    My repository settings are below. Is there anything I'm missing in there? Otherwise, what do I need to do to install these?

        $ cat /etc/apt/sources.list
        deb cdrom:[Ubuntu 14.04 LTS _Trusty Tahr_ - Release amd64 (20140417)]/ precise main restricted
        deb cdrom:[Ubuntu 14.04 LTS _Trusty Tahr_ - Release amd64 (20140417)]/ trusty main restricted
        deb-src http://archive.ubuntu.com/ubuntu trusty main restricted #Added by software-properties
        deb http://archive.ubuntu.com/ubuntu/ trusty main restricted universe multiverse
        deb-src http://archive.ubuntu.com/ubuntu/ trusty main restricted universe multiverse #Added by software-properties
        deb http://security.ubuntu.com/ubuntu/ trusty-security main restricted universe multiverse
        deb-src http://security.ubuntu.com/ubuntu/ trusty-security main restricted universe multiverse #Added by software-properties
        deb http://archive.ubuntu.com/ubuntu/ trusty-updates main restricted universe multiverse
        deb-src http://archive.ubuntu.com/ubuntu/ trusty-updates main restricted universe multiverse #Added by software-properties
        deb http://archive.ubuntu.com/ubuntu/ trusty-proposed main universe restricted multiverse
        deb http://archive.ubuntu.com/ubuntu/ trusty-backports main universe restricted multiverse
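    A hedged first check, since several of the requested packages are :i386 variants: make sure the i386 foreign architecture is enabled before retrying the install, as those package names only resolve once multiarch is on:

        sudo dpkg --add-architecture i386
        sudo apt-get update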

    Read the article

  • JavaCV IplImage to LWJGL Texture

    - by rendrag
    As a side project I've been attempting to make a dynamic display (for example, a screen within a game) that shows images from my webcam. I've been messing around with JavaCV and LWJGL for the past few months and have a basic understanding of how they both work. I found this after scouring Google, but I get an error that the ByteBuffer isn't big enough:

        IplImage img = cam.getFrame();
        ByteBuffer buffer = img.asByteBuffer();
        int textureID = glGenTextures(); //Generate texture ID
        glBindTexture(GL_TEXTURE_2D, textureID); //Bind texture ID
        //I don't know how much of the following is necessary
        //Setup wrap mode
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL12.GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL12.GL_CLAMP_TO_EDGE);
        //Setup texture scaling filtering
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        //Send texture data to OpenGL - this is the line that actually does stuff and that OpenGL has a problem with
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL12.GL_BGR, GL_UNSIGNED_BYTE, buffer);

    That last line throws the exception below. (921600 is 640 × 480 × 3 bytes, i.e. one full BGR frame, so the buffer I'm passing clearly doesn't hold the whole image.)

        Exception in thread "Thread-0" java.lang.IllegalArgumentException: Number of remaining buffer elements is 144, must be at least 921600. Because at most 921600 elements can be returned, a buffer with at least 921600 elements is required, regardless of actual returned element count
            at org.lwjgl.BufferChecks.throwBufferSizeException(BufferChecks.java:162)
            at org.lwjgl.BufferChecks.checkBufferSize(BufferChecks.java:189)
            at org.lwjgl.BufferChecks.checkBuffer(BufferChecks.java:230)
            at org.lwjgl.opengl.GL11.glTexImage2D(GL11.java:2845)
            at tests.TextureTest.getTexture(TextureTest.java:78)
            at tests.TextureTest.update(TextureTest.java:43)
            at lib.game.AbstractGame$1.run(AbstractGame.java:52)
            at java.lang.Thread.run(Thread.java:679)

    Read the article

  • Time Travel 101

    - by Jim Duffy
    I’m thinking maybe I should have used "Time Crunching 101" as the title instead… or maybe "Duh Duffy, where have you been? Everyone knows that!" OK, so maybe you won’t actually learn how to travel through time from this post, but you will learn how to cram more learning into one day. We all know you can’t make it to every conference, every presentation, or every training session. The good news is that many of those events make their content available either to watch online or to download for off-line viewing. The problem is, who has time to sit and watch all those presentations in real time? Not me. One trick I use is to view the content at an increased play rate. Why listen to a boring speaker like me drone on for the entire length of the session when you can listen to them drone on in almost half the time? :-) I view nearly all off-line content with Windows Media Player, though I’m sure you can implement this idea with any media playback software. The idea is changing the playback speed you view the content at. With Windows Media Player you can change the play speed from the menu system. Once you have the Play Speed Settings panel open, you can specify the playback speed. Depending on the content and the presenter, I can typically listen at between 1.6 and 2.0 times normal speed. My Florida edumacation taught me that playing the video back at twice the speed means I’ll listen to it twice as fast, and that means I can view it in almost half the time. Too bad it won’t make me twice as smart. :-) I hope this helps you speed your way through more training content. Have a day. :-|

    Read the article

  • Unable to load nvidia (bumblebee) in Ubuntu 14.04 (only nouveau loads)

    - by Ubuntuser
    Bumblebee stopped working on my system after upgrading to the stable version of Ubuntu 14.04. During installation I get this error:

        rmmod: ERROR: Module nouveau is in use
        Setting up bumblebee (3.2.1-90~trustyppa1) ...
        Selecting 01:00:0 as discrete nvidia card. If this is incorrect, edit the BusID line in /etc/bumblebee/xorg.conf.nouveau
        bumblebeed start/running, process 11133
        Processing triggers for initramfs-tools (0.103ubuntu4.1) ...
        update-initramfs: Generating /boot/initrd.img-3.14.1-031401-generic
        Setting up bumblebee-nvidia (3.2.1-90~trustyppa1) ...
        Selecting 01:00:0 as discrete nvidia card. If this is incorrect, edit the BusID line in /etc/bumblebee/xorg.conf.nvidia
        rmmod: ERROR: Module nouveau is in use
        bumblebeed start/running, process 18284

    It says nouveau is in use. I checked the loaded modules:

        lsmod | grep nouveau
        nouveau 1097199 1
        mxm_wmi 13021 1 nouveau
        ttm 85115 1 nouveau
        i2c_algo_bit 13413 2 i915,nouveau
        drm_kms_helper 52758 2 i915,nouveau
        drm 302817 7 ttm,i915,drm_kms_helper,nouveau
        wmi 19177 3 dell_wmi,mxm_wmi,nouveau
        video 19476 2 i915,nouveau

    However, I have nouveau in my blacklist:

        cat /etc/modprobe.d/blacklist.conf | grep nouveau
        blacklist nouveau
        blacklist lbm-nouveau
        alias nouveau off
        alias lbm-nouveau off

    My grub is also set to nomodeset:

        cat /etc/default/grub | grep nomodeset
        GRUB_CMDLINE_LINUX_DEFAULT="nomodeset quiet splash"

    My graphics card is nvidia Optimus:

        lspci | grep -i vga
        00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 18)
        01:00.0 VGA compatible controller: NVIDIA Corporation GT218M [GeForce 310M] (rev ff)

    I've raised a bug in Launchpad: https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1327598

    Note: nvidia-prime is working for me (partially), with frequent mouse locks. Interestingly, Bumblebee works perfectly fine on my Fedora 20 partition on this same laptop.
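    A hedged suggestion for cases like this, where nouveau keeps loading despite the blacklist: the initramfs may still contain a copy of the module from before the blacklist entries were added, so regenerating it and rebooting is worth a try:

        sudo update-initramfs -u
        sudo reboot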

    Read the article

  • Unable to install anything that depends upon spamassassin. Can't even install spamassassin

    - by Harbhag
    I am trying to install MailScanner using apt-get install mailscanner and I got the following error:

        Setting up spamassassin (3.3.1-1) ...
        Starting SpamAssassin Mail Filter Daemon: child process [21344] exited or timed out without signaling production of a PID file: exit 255 at /usr/sbin/spamd line 2588.
        invoke-rc.d: initscript spamassassin, action "start" failed.
        dpkg: error processing spamassassin (--configure):
        subprocess installed post-installation script returned error exit status 255
        dpkg: dependency problems prevent configuration of mailscanner:
        mailscanner depends on spamassassin (>= 3.1); however:
        Package spamassassin is not configured yet.
        dpkg: error processing mailscanner (--configure):
        dependency problems - leaving unconfigured
        No apport report written because the error message indicates its a followup error from a previous failure.
        Errors were encountered while processing:
        spamassassin
        mailscanner
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    When I tried to install spamassassin on its own, I got exactly the same error (only the child process ID differs). I am using Ubuntu Server 10.04.
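    A hedged diagnostic, since dpkg only reports that spamd exited with status 255: run SpamAssassin's own configuration check to surface the underlying error, then let dpkg retry the pending configuration:

        sudo spamassassin -D --lint
        sudo dpkg --configure -a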

    Read the article

  • How to negotiate with software vendors who do not follow HL7 standards

    - by Peter Turner
    Take, for instance, "". I'd hope that anyone who has spent any time dealing with HL7 messages knows that "" signifies that something should be deleted. "" is not an empty string, it's not a filler, etc. But occasionally one may meet a vendor who persists in sending "" instead of just sending nothing at all. Since I work for a small business and have an extremely flexible HL7 interface, I can ignore ""s in received messages. But these things are adding up. Some vendors like to send custom-formatted fields with pseudo-components that they leave others to interpret themselves. Some vendors send all their information in note segments and assume you're only going to show users the information they send in a monospace font. Some vendors even have the audacity to send carriage return line feeds at the end of each line of a file interface. Some vendors absolutely refuse to send decimal numbers, and in so doing refuse to send any numbers. So, with all this crippling humanity against the simple plastic software man, how does one bend without breaking*? Or better yet, how does one fight back and still make money?

    *my answer is usually to create an interface for the interface and keep the HL7 processing pure, but I don't think this is the best solution
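    For readers who haven't met the convention: in an HL7 field update, an empty field means "leave the stored value unchanged", while "" means "delete the stored value". A hypothetical PID segment illustrating the difference (all field values made up):

        PID|1||12345^^^MRN||Doe^John||19700101|""

    Here the empty fields are no-ops, but the "" in the final field shown instructs the receiving system to erase whatever it currently holds there.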

    Read the article

  • Why am I getting this "Connection to PulseAudio failed" error?

    - by Dave M G
    I have a computer that runs Mythbuntu 11.10. It has an external USB Kenwood digital audio device. When I open up pavucontrol, I get this message: [screenshot of a "Connection to PulseAudio failed" dialog suggesting start-pulseaudio-x11]. If I do as the message suggests and run start-pulseaudio-x11, I get this output:

        $ start-pulseaudio-x11
        Connection failure: Connection refused
        pa_context_connect() failed: Connection refused

    How do I correct this error?

    Update: Somewhere during the course of doing the suggested tests in the comments, a new audio device has become visible in my sound settings. I have not attached or made any new device, so this must be the result of some setting change. The device I use and know about is the Kenwood audio device. The "GF108" device will play sound through the Kenwood anyway, but not reliably. [screenshot of the Sound Settings device list]

    Command-line output as requested in the comments:

        $ ls -l ~/.pulse*
        -rw------- 1 mythbuntu mythbuntu 256 Feb 28 2011 /home/mythbuntu/.pulse-cookie

        /home/mythbuntu/.pulse:
        total 200
        -rw-r--r-- 1 mythbuntu mythbuntu 8192 Oct 23 01:38 2b98330d36bf53bb85c97fc300000008-card-database.tdb
        -rw-r--r-- 1 mythbuntu mythbuntu 69 Nov 16 22:51 2b98330d36bf53bb85c97fc300000008-default-sink
        -rw-r--r-- 1 mythbuntu mythbuntu 68 Nov 16 22:51 2b98330d36bf53bb85c97fc300000008-default-source
        -rw-r--r-- 1 mythbuntu mythbuntu 49152 Oct 14 12:30 2b98330d36bf53bb85c97fc300000008-device-manager.tdb
        -rw-r--r-- 1 mythbuntu mythbuntu 61440 Oct 23 01:40 2b98330d36bf53bb85c97fc300000008-device-volumes.tdb
        lrwxrwxrwx 1 mythbuntu mythbuntu 23 Nov 16 22:50 2b98330d36bf53bb85c97fc300000008-runtime -> /tmp/pulse-EAwvLIQZn7e8
        -rw-r--r-- 1 mythbuntu mythbuntu 77824 Nov 1 12:54 2b98330d36bf53bb85c97fc300000008-stream-volumes.tdb

    And yet more requested command-line output:

        $ ps auxw|grep pulse
        1000 2266 0.5 0.2 294184 9152 ? S<l Nov16 4:26 pulseaudio -D
        1000 2413 0.0 0.0 94816 3040 ? S Nov16 0:00 /usr/lib/pulseaudio/pulse/gconf-helper
        1000 4875 0.0 0.0 8108 908 pts/0 S+ 12:15 0:00 grep --color=auto pulse
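    A hedged generic first step when a per-user PulseAudio daemon refuses connections: kill any stale daemon, move the per-user state (the ~/.pulse directory listed above) out of the way, and start fresh:

        pulseaudio -k
        mv ~/.pulse ~/.pulse.bak
        pulseaudio --start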

    Read the article

  • A Simple Approach For Presenting With Code Samples

    - by Jesse Taber
    Originally posted on: http://geekswithblogs.net/GruffCode/archive/2013/07/31/a-simple-approach-for-presenting-with-code-samples.aspx

    I’ve been getting ready for a presentation and have been struggling a bit with the best way to show and execute code samples. I don’t present often (hardly ever), but when I do I like the presentation to have a lot of succinct and executable code snippets to help illustrate the points that I’m making. Depending on what the presentation is about, I might just want to build an entire sample application that I would run during the presentation. In other cases, however, building a full-blown application might not really be the best way to present the code. The presentation I’m working on now is for an open source utility library for dealing with dates and times. I could probably have cooked up a sample app for accepting date and time input and then contrived ways in which it could put the library through its paces, but I had trouble coming up with one app that would illustrate all of the various features of the library that I wanted to highlight. I finally decided that what I really needed was an approach that met the following criteria:

    Simple: I didn’t want the user interface or overall architecture of a sample application to serve as a distraction from the demonstration of the syntax of the library that the presentation is about. I want to be able to present small bits of code that are focused on accomplishing a single task. Several of these examples will look similar, and that’s OK. I want each sample to “stand on its own” and not rely much on external classes or methods (other than the library that is being presented, of course).

    “Debuggable” (not really a word, I know): I want to be able to easily run the sample with the debugger attached in Visual Studio should I want to step through any bits of code and show what certain values might be at run time. As far as I know this rules out something like LINQPad, though using LINQPad to present code samples like this is actually a very interesting idea that I might explore another time.

    Flexible and Selectable: I’m going to have lots of code samples to show, and I want to be able to just package them all up into a single project or module and have an easy way to run just the sample that I want, on demand.

    Since I’m presenting on a .NET framework library, one of the simplest ways in which I could execute some code samples would be to just create a console application and use Console.WriteLine to output the pertinent info at run time. This gives me a “no frills” harness from which to run my code samples, and I just hit F5 to run it with the debugger. This satisfies numbers 1 and 2 from my list of criteria above, but item 3 is a little harder. By default, just running a console application is going to execute the ‘main’ method and then terminate the program after all code is executed. If I want to have several different code samples and run them one at a time, it would be cumbersome to keep swapping the code I want in and out of the ‘main’ method of the console application. What I really want is an easy way to keep the console app running throughout the whole presentation and just have it run the samples I want when I want. I could set up a simple Windows Forms or WPF desktop application with buttons for the different samples, but then I’m getting away from my first criterion of keeping things as simple as possible.

    Infinite Loops To The Rescue

    I found a way to have a simple console application satisfy all three of my requirements above, and it involves using an infinite loop and some Console.ReadLine calls that will give the user an opportunity to break out and exit the program. (All programs that need to run until they are closed explicitly (or crash!) likely use similar constructs behind the scenes. Create a new Windows Forms project, look in the ‘Program.cs’ that gets generated, and then check out the docs for the Application.Run method that it calls.) Here’s how the main method might look:

        static void Main(string[] args)
        {
            do
            {
                Console.Write("Enter command or 'exit' to quit: > ");
                var command = Console.ReadLine();
                if ((command ?? string.Empty).Equals("exit", StringComparison.OrdinalIgnoreCase))
                {
                    Console.WriteLine("Quitting.");
                    break;
                }

            } while (true);
        }

    The idea here is the app prompts me for the command I want to run, or I can type in ‘exit’ to break out of the loop and let the application close. The only trick now is to create a set of commands that map to each of the code samples that I’m going to want to run. Each sample is already encapsulated in a single public method in a separate class, so I could just write a big switch statement or create a hashtable/dictionary that maps command text to an Action that will invoke the proper method, but why re-invent the wheel?

    CLAP For Your Own Presentation

    I’ve blogged about the CLAP library before, and it turns out that it’s a great fit for satisfying criterion #3 from my list above. CLAP lets you decorate methods in a class with an attribute and then easily invoke those methods from within a console application. CLAP was designed to take the arguments passed into the console app from the command line and parse them to determine which method to run and what arguments to pass to that method, but there’s no reason you can’t re-purpose it to accept command input from within the infinite loop defined above and invoke the corresponding method. Here’s how you might define a couple of different methods to contain two different code samples that you want to run during your presentation:

        public static class CodeSamples
        {
            [Verb(Aliases="one")]
            public static void SampleOne()
            {
                Console.WriteLine("This is sample 1");
            }

            [Verb(Aliases="two")]
            public static void SampleTwo()
            {
                Console.WriteLine("This is sample 2");
            }
        }

    A couple of things to note about the sample above:

    I’m using static methods. You don’t actually need to use static methods with CLAP, but the syntax ends up being a bit simpler, and static methods happen to lend themselves well to the “one self-contained method per code sample” approach that I want to use.

    The methods are decorated with a ‘Verb’ attribute. This tells CLAP that they are eligible targets for commands. The “Aliases” argument lets me give them short and easy-to-remember aliases that can be used to invoke them. By default, CLAP just uses the full method name as the command name, but with aliases you can simplify the usage a bit.

    I’m not using any parameters. CLAP’s main feature is its ability to parse out arguments from a command-line invocation of a console application and automatically pass them in as parameters to the target methods. My code samples don’t need parameters, and honestly having them would complicate giving the presentation, so this is a good thing. You could use this same approach to invoke methods with parameters, but you’d have a couple of things to figure out. When you invoke a .NET application from the command line, Windows will parse the arguments and pass them in as a string array (called ‘args’ in the boilerplate console project Program.cs). The parsing that gets done here is smart enough to deal with things like treating strings in double quotes as one argument, and you’d have to re-create that within your infinite loop if you wanted to use parameters. I plan on either submitting a pull request to CLAP to add this capability or maybe just making a small utility class/extension method to do it and posting that here in the future.

    So I now have a simple class with static methods to contain my code samples, and an infinite loop in my ‘main’ method that can accept text commands. Wiring this all up together is pretty easy:

        static void Main(string[] args)
        {
            do
            {
                try
                {
                    Console.Write("Enter command or 'exit' to quit: > ");
                    var command = Console.ReadLine();
                    if ((command ?? string.Empty).Equals("exit", StringComparison.OrdinalIgnoreCase))
                    {
                        Console.WriteLine("Quitting.");
                        break;
                    }

                    Parser.Run<CodeSamples>(new[] { command });
                    Console.WriteLine("---------------------------------------------------------");
                }
                catch (Exception ex)
                {
                    Console.Error.WriteLine("Error: " + ex.Message);
                }

            } while (true);
        }

    Note that I’m now passing the ‘CodeSamples’ class into the CLAP ‘Parser.Run’ as a type argument. This tells CLAP to inspect that class for methods that might be able to handle the commands passed in. I’m also throwing in a little “----” style line separator and some basic error handling (because I happen to know that some of the samples are going to throw exceptions for demonstration purposes) and I’m good to go. Now during my presentation I can just have the console application running the whole time with the debugger attached and just type in the alias of the code sample method that I want to run when I want to run it.

    Read the article

  • Surface RT: To Be Or Not To Be (Part 1)

    - by smehaffie
    So the Surface RT has been out for 9 months and Microsoft just declared a $900 million write-down. So how did this happen, and what does it mean for Microsoft’s efforts to break into the tablet market? I have been thinking a lot about most of the information below since the Surface product line was released. If you are looking for a “Microsoft Is Dead” story, then don’t read any further. But if you want an honest look at what I think led Microsoft to this point and what I think can be done to make Surface RT devices better, then please continue reading.

    What Led Microsoft To The $900 Million Write-Down

    Surface Unveiling: Microsoft totally missed the boat when they unveiled the Surface product line on June 18th, 2012. Microsoft should’ve been ready to post the specifications of both devices that night. Microsoft should’ve had a site up and running right after the event so people could pre-order the devices. This would have given them a good idea what the interest was in each device. They could also have used this data to make a better estimate of the number of units to have available for the launch and beyond. They also lost out on taking advantage of the excitement generated by the Surface RT and Surface Pro announcement. They could have thrown in a free touch keyboard for anyone who pre-ordered. The advertising should have started right after the announcement and gotten bigger as launch day approached: push for as many pre-orders as possible and build excitement for the launch.

    Actual Launch (Surface RT): By this time all excitement was gone from the initial announcement, except among the Microsoft faithful. Microsoft should have been ready to sell the Surface in as many markets as possible at launch. The limited market release was a real letdown for a lot of people. A limited release right after the initial announcement is understandable, but not at the official launch of the product. Microsoft overpriced the device, and now they are lowering the price to what it should have been to start with. The $349 price is within the range I suggested before pricing was announced (Surface Tablets: The Price Must Be Right). Limited ordering options online were also a killer. Users should have been able to buy the base unit of each device and then add on whatever keyboard they wanted (this applies more to the Surface Pro). There should also have been a place where users could order any additional add-ons that they wanted to buy (covers, extra power supplies, etc.). Marketing was better, and the dancing “Click In” commercial was cool, but the ads comparing the iPad with Siri should have been on the air from day one of the announcement (or at least the launch). Consumers want to know why your tablet is better, not just that it has a clickable keyboard and a built-in kickstand. They could also have compared it to some of the other mid-range tablets if they had not overpriced it to begin with.

    Stock Applications (Mail, People, Calendar, Music, Video, Reader and IE): This is where Microsoft really blew it. They had all the time in the world to make these applications best of breed, and instead we got applications that seemed thrown together. Some updates have made these applications better, but they are all still lacking features that should have been there from day one. This did not help to enhance a new user’s experience any. I will admit that the data-driven applications were first-class citizens, and that makes it even more perplexing why MS could knock it out of the park with Weather, Travel, Finance, Bing, etc., and fail so miserably on the core applications users would use the most on a tablet.

    Desktop on Tablet: The desktop is just so out of place on the tablet. I understand it was needed for Office, but I think it would have been better not to have the desktop in Windows RT, and instead open up the Office applications in full-screen mode, in a desktop shell (same goes for IE11). That way the user wouldn’t realize they are leaving Metro and going to the desktop. The other option would have been to just not include Office on Windows RT devices. Instead they could have made awesome Windows Store apps for Word, Excel, OneNote and PowerPoint. In addition, they could have made the stock Mail, People, and Calendar applications contain all the functions that Outlook gives desktop users. Having some of the settings in desktop mode and others under “Change PC Settings” made Windows RT seem unfinished and rushed to market.

    What Can Be Done To Make Windows RT Based Tablets Better (At Least In My Opinion)

    - Either eliminate the desktop altogether from Windows RT, or at least make the user experience better by hiding the fact that the user is running Office/IE in the desktop. Personally I’d like them to totally get rid of it and just make awesome Windows Store application versions of Word, Excel, PowerPoint and OneNote. This might also make the OS smaller and give the user more available disk space. I doubt there will ever be Windows Store app versions of Office, but I still think it is a good idea.
    - Make it so users can easily direct their documents, pictures, videos and music to their extra storage and can access these files from the standard libraries. A user should not have to create a VM on their microSD card or create symbolic links to get this to work properly. Most consumers would not be able to do this. Users get frustrated when they run out of room on their main storage because nothing is automatically saved to the microSD card when saved to libraries. This is a major bug that needs to be fixed; otherwise Microsoft’s selling point of having a microSD slot is worthless.
    - Allow users to uninstall and re-install any of the Office products that come with the Surface. That way people can free up storage space by uninstalling the Office applications they do not need. Everyone’s needs are different, so make the options flexible. Don’t take up storage space for applications the user will not use.
    - Make the core applications the cream of the crop of Windows App Store applications. They should set the bar for all other Store applications.
    - Improve performance as much as possible; if it seems sluggish on a tablet, consumers will not buy it.
    - Price the next line of Surface products very aggressively to undercut not only the iPad but also low-end Android tablets (Nook, Kindle Fire, Nexus, etc.).
    - Give developers incentives to write quality applications for the devices. Don’t reward developers for cranking out cookie-cutter, low-quality applications. I’d even suggest Microsoft consider implementing new store certification guidelines to stop these types of applications being published.
    - Allow users to easily move the recovery disk partition between their microSD card and main storage.

    My Predictions For The Surface RT And Windows RT

    I honestly think that even with all the missteps MS has made since the announcement of the Surface product line, they are on the right path. I was excited about the Surface tablets when they were announced, and I still am. Truth be told, Windows 8 on a tablet (aka Windows RT) is better than both iOS and Android. My nephew, who is an Apple fan boy, told me after he saw and used Windows 8 (he got the beta running on his iPad) that Windows 8 kicked Apple’s butt as a tablet OS. So there is hope for all Windows RT based tablets. I agree with my nephew, and that is why whenever anyone asks me about my Surface, I love showing it off and recommend it. The 6 keys to gaining market share in the tablet market are:

    1. Aggressive pricing by both Microsoft and their OEMs.
    2. Good quality devices put out by Microsoft and their OEMs (there are some out there, but not enough).
    3. Marketing, marketing, marketing from both Microsoft and their OEMs (we need more ads showing why Windows based tablets are better than iPads and Android tablets).
    4. Getting Windows tablets into retail stores everywhere, and giving salespeople incentives to sell them. Consumers like to try electronics out before they buy them, and most will listen to what the salesperson suggests. Microsoft needs salespeople in retail stores directing people to buy Windows based tablets over iPads and Android tablets. I think the Microsoft Stores within Best Buy are a good start, but they also need to get prominent displays in Walmart, Target, etc.
    5. Release a smaller form factor Surface; hopefully the 8”-10” next generation Surface is not a rumor.
    6. Make “Surface” the brand name for all Microsoft tablets and hybrid devices that they come out with. They cannot change the name with each new release. Make Surface synonymous with quality, the same way that iPad is for Apple.

    Well, that is my 2 cents on the subject. Let me know your thoughts by leaving a comment below. Soon to follow will be my thoughts on the Surface Pro, so keep an eye out for it.

    Read the article

  • PHP Aspect Oriented Design

    - by Devin Dixon
    This is a continuation of this Code Review question. What was taken away from that post, and from other aspect-oriented designs, is that they are hard to debug. To counter that, I implemented the ability to turn on tracing of the design patterns. Turning trace on works like this:

        //This can be added anywhere in the code
        Run::setAdapterTrace(true);
        Run::setFilterTrace(true);
        Run::setObserverTrace(true);
        //Execute the function
        echo Run::goForARun(8);

    In the actual log with the trace turned on, the output looks like this:

        adapter 2012-02-12 21:46:19 {"type":"closure","object":"static","call_class":"\/public_html\/examples\/design\/ClosureDesigns.php","class":"Run","method":"goForARun","call_method":"goForARun","trace":"Run::goForARun","start_line":68,"end_line":70}
        filter 2012-02-12 22:05:15 {"type":"closure","event":"return","object":"static","class":"run_filter","method":"\/home\/prodigyview\/public_html\/examples\/design\/ClosureDesigns.php","trace":"Run::goForARun","start_line":51,"end_line":58}
        observer 2012-02-12 22:05:15 {"type":"closure","object":"static","class":"run_observer","method":"\/home\/prodigyview\/public_html\/public\/examples\/design\/ClosureDesigns.php","trace":"Run::goForARun","start_line":61,"end_line":63}

    When the information is broken down, the data translates to:

    - Called by an adapter, filter or observer
    - The function called was a closure
    - The location of the closure
    - The Class::method the adapter was implemented on
    - The trace of where the method was called from
    - Start line and end line

    The code has been proven to work in production environments and features various examples of how to implement it, so the proof of concept is there. It is not DI and accomplishes things that DI cannot. I wouldn't call the code boilerplate, but I would call it bloated. In summary, the weaknesses are bloated code and a learning curve, in exchange for aspect-oriented functionality. Beyond the normal fear of something new and different, what are the other weaknesses in this implementation of aspect-oriented design, if any?

    PS: More examples of AOP here: https://github.com/ProdigyView/ProdigyView/tree/master/examples/design

    Read the article

  • disable all hotkeys with dconf-editor

    - by Gijs
    I'm building a deb package that will create a kiosk user account. When you log in to this account, the browser automatically starts and navigates to a pre-given website. Also, all the hotkeys should be disabled except for one you define. Now I'm at the point where the browser starts and my disable script is executed on logon, but, for example, I can still close the browser with Alt + F4. And when I check dconf-editor for the hotkeys, every single binding shows as deleted, but my binding for close isn't added, as you can see in the screenshot. I disabled all these keys with this code, one line for every hotkey:

        gsettings set org.gnome.desktop.wm.keybindings begin-resize []

    So this seems to work, but at the bottom of my disable script this line should be executed:

        gsettings set org.gnome.desktop.wm.keybindings close ['<Alt>b']

    Does anyone know why I'm able to unbind all these hotkeys in dconf-editor yet they are still able to do their job? And if it is possible to unbind them, why can't I bind mine? I searched the web for a solution but couldn't find one fitting my needs. I hope some of you know the answer to this. Regards, Gijs
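    A hedged guess at the failing bind: when that last line runs from a script, the shell mangles the unquoted brackets and angle brackets before gsettings ever sees them, so the array value never arrives intact. Quoting the whole value usually fixes it:

        gsettings set org.gnome.desktop.wm.keybindings close "['<Alt>b']"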

    Read the article

  • How to change the screen resolution in VNC viewer for Ubuntu 12.04 without a monitor?

    - by user325320
    I have Ubuntu 12.04 installed on a machine and I always use it remotely over VNC. When I have a monitor connected to this machine, I can change the resolution of my VNC viewer with the following line:

        $ vnc4server --geometry 1440x900

    This worked for me, but since I always use this machine remotely, I unplugged the monitor and rebooted, and the above command no longer works. Then I tried xrandr:

        SZ: Pixels Physical Refresh
        *0 1024 x 768 ( 260mm x 195mm ) *60
        Current rotation - normal
        Current reflection - none
        Rotations possible - normal
        Reflections possible - none

    There is only one option available, so I tried to add a new one:

        $ cvt 1440 900
        # 1440x900 59.89 Hz (CVT 1.30MA) hsync: 55.93 kHz; pclk: 106.50 MHz
        Modeline "1440x900_60.00" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
        $ xrandr --newmode "1440x900_60.00" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
        $ xrandr --addmode S2 "1440x900_60.00"

    Then I checked with xrandr again and couldn't see the new mode added. I tried to execute the following command and got an error saying my RandR is too old:

        $ xrandr --output S2 --mode 1440x900_60.00
        xrandr: Server RandR version before 1.2

    But this does not make sense to me: if I plug the monitor back in and run the xrandr command, it works again! It seems that Ubuntu must be connected to a real monitor before I can change the resolution in my VNC viewer. Can anyone help?

    UPDATE: I finally solved this problem by changing to tightvncserver:

        $ tightvncserver -geometry 1440x900

    works for me. Thanks to everyone who answered my question.

    Read the article

  • Can I use Ubuntu One to sync data files between two remote computers?

    - by Sleepy John
    I've got two computers, both running Ubuntu, with files in their home folders synced to Ubuntu One. I'd like to know if it's possible to make Ubuntu One automatically download data changes that have been uploaded automatically to Ubuntu One from one computer to the equivalent data file on the other. Clarifying a bit further: I've installed Red Notebook on both computers, so each has its own ~/.rednotebook/data folder containing a series of .txt files corresponding to the monthly entries in each of them. These are synced so that any changes to those .txt files are uploaded to Ubuntu One. My question is: can I, and if so how do I, make Ubuntu One automatically download and replace those .txt files on the other computer after they've been updated and uploaded from the first computer? I did laboriously manage to download all the text files which had been uploaded from the first computer, one by one from Ubuntu One to the second computer, but what I want to do is automate this process, and that's where I'm stuck. I'm aware that things could get a bit complicated if both my computers were online at the same time and both were simultaneously making different Red Notebook entries, so that's not the scenario I'm trying to cover. All I want to achieve is that whatever updates to the files have been uploaded by one computer will automatically be downloaded to the same-named files on the other computer as soon as that second computer appears online and detects that Ubuntu One has matching but more recent synced files than the ones it's holding.

    Read the article

  • What, if anything, to do about bow-shaped burndowns?

    - by Karl Bielefeldt
    I've started to notice a recurring pattern to our team's burndown charts, which I call a "bowstring" pattern. The ideal line is the "string" and the actual line starts out relatively flat, then curves down to meet the target like a bow. My theory on why they look like this is that toward the beginning of the story, we are doing a lot of debugging or exploratory work that is difficult to estimate remaining work for. Sometimes it even goes up a little as we discover a task is more difficult once we get into it. Then we get into implementation and test which is more predictable, hence the curving down graph. Note I'm not talking about a big scale like BDUF, just the natural short-term constraint that you have to find the bug before you can fix it, coupled with the fact that stories are most likely to start toward the beginning of a two-week iteration. Is this a common occurrence among scrum teams? Do people see it as a problem? If so, what is the root cause and some techniques to deal with it?

    Read the article

  • BizTalk - Removing BAM Activities and Views using bm.exe

    - by Stuart Brierley
    Originally posted on: http://geekswithblogs.net/StuartBrierley/archive/2013/10/16/biztalk---removing-bam-activities-and-views-using-bm.exe.aspx

    On the project I am currently working on, we are making quite extensive use of BAM within our growing number of BizTalk applications, all of which are being deployed and undeployed using the excellent Deployment Framework for BizTalk 5.0. Recently I had an issue where problems on the build server had left the target development servers in a state where the BAM activities and views for a particular application were not being removed by the undeploy process, and unfortunately the definition in the solution had changed, meaning that I could not easily recreate the file from source control. To get around this I used the bm.exe application from the command line to manually remove the problem BAM artifacts. bm.exe can be found at the following path:

        C:\Program Files (x86)\Microsoft BizTalk Server 2010\Tracking

    Step 1: Get the BAM definition file. Run the following command to get the BAM definition file, containing the details of all the activities, views and alerts:

        bm.exe get-defxml -FileName:{Path and File Name Here}.xml

    Step 2: Remove the BAM artifacts. At this stage I chose to manually remove each of my problem BAM activities and views using separate command-line calls. By looking in the definition file I could see the names of the activities and views that I wanted to remove, and then used the following commands to remove first the views and then the activities:

        bm.exe remove-view -name:{viewname}
        bm.exe remove-activity -name:{activityname}
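    If you later fix the definition and need to push the whole corrected file back in one go, bm.exe can also do that; a hedged sketch, assuming the definition file exported above has been amended:

        bm.exe deploy-all -DefinitionFile:{Path and File Name Here}.xml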

    Read the article

  • Tools of the Trade

    - by Ajarn Mark Caldwell
    I got pretty excited a couple of days ago when my new laptop arrived. “The new phone books are here!  The new phone books are here!  I’m a somebody!” - Steve Martin in The Jerk

    It is a Dell Precision M4500 with an Intel Core i7 2.8 GHz running 64-bit Windows 7, with a 15.6” widescreen, 8 GB RAM, and a 256 GB SSD.  For some of you high fliers, this may be nothing to write home about, but compared to the 32-bit Windows XP laptop with 2 GB of RAM and a regular hard disk that I’m coming from, it’s a really nice step forward.  I won’t even bore you with the details of the desktop PC I was first given when I started here 5 1/2 years ago.  Let’s just say that things have improved.  One really nice thing is that while we are definitely running a lean and mean department in terms of staffing, my boss believes in supporting that lean staff with good tools in order to stay lean instead of having to spend even more money on additional employees.  Of course, that only goes so far, and at some point you have to add more people in order to get more work done, which is why we are bringing on board a new employee and a new contract developer next week.  But that’s a different story for a different time.

    The main topic for this post is to highlight the variety of tools that I use in my job and that you might find useful, too.  This is easy to do right now because the process of building up my new laptop from scratch has forced me to assemble a list of software that had to be installed and configured.  Keep in mind as you look through this list that I play many roles in our company.  My official title is Software Engineering Manager, but in addition to managing the team, I am also an active ASP.NET and SQL developer, the Database Administrator, and 50% of the SAN Administrator team.  So, without further ado, here are the tools and some comments about why I use them:

    Virtual Clone Drive - Easily mount an ISO image as a DVD drive.  This is particularly handy when you are downloading disk images from Microsoft for your tools.

    SQL Server 2008 R2 Developer Edition - We are migrating all of our active systems to SQL 2008 R2.  Developer Edition has all the features of Enterprise Edition, but is intended for development use.

    SQL Server 2005 Developer Edition (BIDS only) - The migration to SSRS 2008 R2 is just getting started, and in the meantime, maintenance work still has to be done on the reports on our SQL 2005 server.  For some reason, you can’t use BIDS from 2008 to write reports for a 2005 server.  The formats differ, and when you open 2005 reports in 2008 BIDS, it forces you to upgrade them, after which they can no longer be uploaded to a 2005 server.  Hopefully Microsoft will fix this soon, in some manner similar to the way Visual Studio now allows you to pick which version of the .NET Framework you are coding against.

    Visual Studio 2010 Premium - All of our application development is in ASP.NET, and we might as well use the tool designed for it.  I’ve used a version of Visual Studio going all the way back to VB 6.0 and Visual InterDev.

    Vault Professional Client - Several years ago we replaced Visual SourceSafe with SourceGear Vault (then Fortress, and now Vault Pro), and I love it.  It is very reliable with low overhead - perfect for a small to medium size development team.  And being a small ISV, their support is exceptional.

    Red Gate Developer Bundle with the SQL Source Control update for Vault - I first used, and fell in love with, SQL Prompt shortly before Red Gate bought it, and then Red Gate’s first release made me love it even more.  SQL Refactor (which has since been rolled into the latest version of SQL Prompt) has saved me many hours and migraines trying to understand somebody else’s code when their indenting was nonexistent or, worse, irrational.  SQL Compare has been awesome for troubleshooting potential schema issues between different instances of system databases.  SQL Data Compare helped us identify the cause behind a bug which appeared in PROD but could not be reproduced in a nearly (but not quite exactly) identical copy in UAT.  And the newest tool we are embracing: SQL Source Control.  I blogged about it here (and here, and here) last December.  This is really going to help us keep each developer’s copy of the database in sync with the others.

    Fiddler - Helps you watch the whole traffic stream on web visits.  I haven’t used it a lot, but it did help me track down some odd 404 errors we were finding in our own application logs.  It has some other JavaScript troubleshooting capabilities, but some of its usefulness has been supplanted by the Developer Tools option in IE8.

    Funduc Search & Replace - Finds any string anywhere in a mound of source code really, really fast.  Does RegEx searches, if you understand that foreign language.  It has really helped with some refactoring work to pinpoint, for example, everywhere a particular stored procedure is referenced, whether in .NET code or other SQL procedures (which we have in script files).  Provides an in-context preview of the search results.  Fantastic tool, and at a bargain price.

    SciTE - SciTE is a Scintilla-based text editor and a fantastic, lightweight tool for quickly reviewing (or writing) program code, SQL scripts, and extract files.  It has language-specific syntax highlighting.  I used it to write several batch and CMD programs a year ago, and to examine data extract files for exchanging information with other systems.  Extremely handy are the options to View End of Line and View Whitespace.  Ever receive a file that is supposed to use CRLF as an end-of-line marker, but really only has CRs?  SciTE will quickly make that visible.

    Infragistics Controls - We do a lot of ASP.NET development, and frequently use the WebGrid, WebTab, and date picker controls.  We will likely be implementing the Hierarchical Data Grid soon.  Infragistics has control suites for WebForms, WinForms, Silverlight, and, coming soon, MVC/jQuery.

    WinZip, with the command-line add-in - The classic compression program with a great command-line interface that allows me to build those CMD (and soon PowerShell) programs for automated compression jobs.  Our versioned build packages are zip files (a small scripted example appears at the end of this post).

    XML Notepad - I haven’t used this a lot myself, but one of my team really likes it for examining large XML files.

    LINQPad - Again, I haven’t used this one a lot, but it was recommended to me for learning and practicing my LINQ skills, which will come in handy as we implement Entity Framework.

    SQL Sentry Plan Explorer - SQL Server Showplan on steroids.  Great for helping you focus on the parts of a large query that are of most importance.  Also great for compressing the graphical plan into a more readable layout.

    Araxis Merge - A great diff and merge tool.  SourceGear provides a great tool called DiffMerge that we use all the time, but occasionally I like the cross-edit capabilities of Araxis Merge.  For a while, we also produced diff reports in HTML that showed all the changes that occurred between two releases.  This was most important when we were putting out very small, but very important, hot fixes on a very politically hot system.  The reports produced by Araxis Merge gave the Director of IS assurance that we were not accidentally introducing ripples throughout the system with our releases.

    Idera SQL Admin Toolset - A great collection of tools, including a password checker to help analyze your SQL Server for weak user passwords, and a Backup Status tool to quickly scan a large list of servers and databases to identify any that are overdue for backups.  Particularly helpful for highlighting new databases that have been deployed without getting included in your backup processing.  I also like Space Analyzer to keep an eye on disk space consumed by database files.

    Idera SQL Job Manager - This free tool provides a nice calendar view of SQL Server job schedules, but to a degree, you also get what you pay for.  We will be purchasing SQL Sentry Event Manager later this year as an even better job schedule reviewer/manager.  But in the meantime, this at least gives me a good view of potential resource conflicts across multiple instances of SQL Server.

    DBFViewer 2000 - I inherited a couple of FoxPro databases that I have to keep an eye on occasionally and have not yet been able to migrate to SQL Server.

    Balsamiq Mockups - We are still in evaluation mode on this tool, but I really like it as a quick UI mockup tool that does not require Visual Studio, so someone other than a programmer can do UI design.  The interface looks hand-drawn, which definitely has some psychological benefits when communicating with users, too.

    FeedDemon - I have to stay on top of my WAY TOO MANY blog subscriptions somehow.  I may read blogs on a couple of different computers, and FeedDemon’s integration with Google Reader allows me to keep them all in sync.  I don’t particularly like the Google Reader interface, or the fact that it always wanted to mark articles as read just because I scrolled past them.  FeedDemon solves this problem for me, and provides a multi-tabbed interface, which is good because fairly frequently one blog will link to something else I want to read, and I can end up with a half-dozen open tabs all from one article.

    Synergy+ - In my office, I run four monitors across two computers, all with one mouse and keyboard.  Synergy is the magic software that makes this work.

    TweetDeck - I’m not the most active Tweeter in the world, but when I want to check in with the Twitterverse, this really helps.  I have found the #sqlhelp and #PoshHelp hash tags particularly useful, and I also have columns set up to make it easy to monitor #sqlpass, #PASSProfDev, and short-term events like #sqlsat68.

    Whew!  That’s a lot.  No wonder it took me a couple of days to get everything set up the way I wanted it.  Oh, that and actually getting some work accomplished at the same time.  Anyway, I know that is a huge dump of info, and most people never make it here to the end, so for those who did, let me say, CONGRATULATIONS, you made it! I hope you’ll find a new tool or two to make your work life a little easier.
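    As promised above, here is a hedged sketch of the kind of automated compression job the WinZip command-line add-in enables. It assumes the add-in's wzzip.exe is on the PATH; the build number and paths are placeholder assumptions:

    rem Sketch: package a versioned build folder into a zip file.
    rem wzzip.exe comes from the WinZip Command Line Support Add-On;
    rem -r recurses into subfolders and -p stores relative folder names.
    wzzip -rp C:\Packages\Build_1.2.3.zip C:\Builds\1.2.3\*.*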

    Read the article

  • DENY select on sys.dm_db_index_physical_stats

    - by steveh99999
    I recently saw an interesting blog article by Paul Randal about the performance overhead of querying sys.dm_db_index_physical_stats. So I was thinking: would it be possible to let non-sysadmin users query DMVs on a SQL Server, but stop them querying this I/O-intensive DMV? Yes it is, and here’s how…

    1. Create a new login for test purposes, with permissions to access the AdventureWorks database only:

    CREATE LOGIN [test] WITH PASSWORD='xxxx', DEFAULT_DATABASE=[AdventureWorks]
    GO
    USE [AdventureWorks]
    GO
    CREATE USER [test] FOR LOGIN [test] WITH DEFAULT_SCHEMA=[dbo]
    GO

    2. Log in as user test and issue the command:

    SELECT * FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks'),NULL,NULL,NULL,'DETAILED')

    This gets the error:

    Msg 297, Level 16, State 12, Line 1
    The user does not have permission to perform this action.

    3. As a sysadmin, issue the command:

    USE AdventureWorks
    GRANT VIEW DATABASE STATE TO [test]

    or

    GRANT VIEW SERVER STATE TO [test]

    if all databases can be queried via DMVs.

    4. Try again as user test to issue the command:

    SELECT * FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks'),NULL,NULL,NULL,'DETAILED')

    This now produces valid results from the DMV.

    5. Now create the test user in the master database, in the public role only:

    USE master
    CREATE USER [test] FOR LOGIN [test]

    6. Issue the command:

    USE master
    DENY SELECT ON sys.dm_db_index_physical_stats TO [test]

    7. Now go back to AdventureWorks using the test login and try:

    SELECT * FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks'),NULL,NULL,NULL,'DETAILED')

    This now gets the error:

    Msg 229, Level 14, State 5, Line 1
    The SELECT permission was denied on the object 'dm_db_index_physical_stats', database 'mssqlsystemresource', schema 'sys'.

    but the user is still able to query all other, non-I/O-intensive DMVs. If the user attempts to view the index physical stats via a built-in Management Studio report (see a recent blog post by Pinal Dave), they get an error as well.
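    Should you later need to undo the restriction, a minimal sketch (assuming the same [test] user set up above) is to revoke the explicit DENY in master; the user then falls back to the access implied by the earlier VIEW DATABASE STATE or VIEW SERVER STATE grant:

    USE master
    GO
    -- Remove the explicit DENY recorded against [test]; the earlier
    -- VIEW DATABASE STATE / VIEW SERVER STATE grant applies again,
    -- so the DMV becomes queryable once more.
    REVOKE SELECT ON sys.dm_db_index_physical_stats FROM [test]
    GO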

    Read the article

< Previous Page | 339 340 341 342 343 344 345 346 347 348 349 350  | Next Page >