Search Results

Search found 5930 results on 238 pages for 'coding standards'.

  • change texture at runtime

    - by user1509674
    How can I change a texture at runtime? I have already managed to change a label's text at runtime, using the following code:

        using UnityEngine;
        using System.Collections;

        public class switchtime : MonoBehaviour
        {
            // Use this for initialization
            private bool showLabel = false;
            private bool showLabe2 = false;
            private bool showLabe3 = false;
            private bool showLabe4 = false;

            public void Start()
            {
                Invoke("ToggleLabel", 1);
                Invoke("ToggleLabel2", 3);
                Invoke("ToggleLabel3", 6);
                Invoke("ToggleLabel4", 9);
            }

            public void ToggleLabel() { showLabel = !showLabel; }
            public void ToggleLabel2() { showLabe2 = !showLabe2; }
            public void ToggleLabel3() { showLabe3 = !showLabe3; }
            public void ToggleLabel4() { showLabe4 = !showLabe4; }

            public void OnGUI()
            {
                if (showLabel) { GUI.Label(new Rect(300, 200, 100, 20), "Copying window file.."); }
                if (showLabe2) { GUI.Label(new Rect(40, 40, 100, 20), "Expanding windows file.."); }
                if (showLabe3) { GUI.Label(new Rect(80, 80, 100, 20), "Installing Feature.."); }
                if (showLabe4) { GUI.Label(new Rect(100, 100, 100, 20), "Installing Updates"); }
            }
        }

    Now I need to change a GUITexture at runtime. How can I do this? Can anybody help me with the coding? Here are some changes I made: I worked on swapping the GameObject when the mouse is placed over it. This is the code I applied, and it is working for me:

        using UnityEngine;
        using System.Collections;

        public class change : MonoBehaviour
        {
            // Use this for initialization
            public GameObject newSprite;
            private Vector3 currentSpritePosition;

            void Update() { }

            void Start()
            {
                newSprite.renderer.enabled = false;
                currentSpritePosition = transform.position;
                // then make it invisible
                renderer.enabled = false;
                // give the new sprite the position of the latter
                newSprite.transform.position = currentSpritePosition;
                // then make it visible
                newSprite.renderer.enabled = true;
            }

            void OnMouseExit()
            {
                // just the reverse process
                renderer.enabled = true;
                newSprite.renderer.enabled = false;
            }
        }

    The problem is that, in the same code, I need to set a timer so that each text loads without a mouse click - that is, each one should appear after a particular time, one after another. In my screenshots I have placed the images one below the other; when the mouse hovers over one, its text changes from bold to normal (the change is shown in images 2 and 3). The code I posted above works for me for that mouse-over case.
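
    What I picture is the same Invoke-based timing applied to a texture swap instead of a label. A minimal sketch, assuming the legacy GUITexture component is attached to the same GameObject and the two Texture fields are assigned in the Inspector (the field names and the 3-second delay are just placeholders):

        using UnityEngine;

        public class SwitchTexture : MonoBehaviour
        {
            // Assumed to be assigned in the Inspector.
            public Texture firstTexture;
            public Texture secondTexture;

            void Start()
            {
                // Requires a GUITexture component on this GameObject.
                // Show the first texture straight away, then swap on a timer,
                // the same way the labels above are toggled with Invoke.
                guiTexture.texture = firstTexture;
                Invoke("ShowSecondTexture", 3f);
            }

            void ShowSecondTexture()
            {
                guiTexture.texture = secondTexture;
            }
        }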

    Read the article

  • Enabling syntax highlighting for LESS in Programmer's Notepad?

    - by Cody Gray
    When I don't feel like firing up the Visual Studio behemoth, or when I don't have it installed, I always turn to Programmer's Notepad. It's an amazingly light and fast little text editor, with the special advantage that it is completely platform-native and conforms to standard UI conventions. Therefore, please do not suggest that I consider using other text editors. I've already considered and rejected them because they do not use native UI controls. I like Programmer's Notepad, thank you very much. Unfortunately, I've recently begun to learn, use, and love LESS for all of my CSS coding needs, and it appears that Programmer's Notepad is not bundled with a syntax highlighting scheme for LESS. Does anyone know if there is—by chance and good fortune—one already available somewhere on the web that some kind soul has tediously prepared? If not, how can I go about writing one of my own? Is there a way to build on the existing CSS scheme? It's also possible that any code coloring scheme designed for Scintilla-based editors will work, as Programmer's Notepad is based on the Scintilla control. If you know of a LESS highlighting scheme for Scintilla-based editors, and how to use that with Programmer's Notepad, please suggest that as well.

    Read the article

  • How to configure Hyper-V failover cluster to live migrate when dynamic memory runs out?

    - by Matt Johnson
    Apologies in advance that this is not a direct programming question, but I have a feeling that the solution involves custom PowerShell scripts (maybe), so this is as good a place to ask as any. I maintain a website that has a large Hyper-V cluster for SQL Servers. We are using Windows 2008 R2 SP1 and the new "dynamic memory" feature. I've already reviewed the Best Practices Guide and implemented its suggested configuration. Everything works well, except that when SQL demand pushes memory pressure beyond what is available on the physical machine, the memory status goes into the "Warning" state and stays there. I assume the hypervisor is using a swap file on the host to fulfill the memory requirement, thus slowing the virtual machine down. When this happens, there are plenty of other nodes in the cluster that have available resources. I can live-migrate the virtual server over there and everything works, and the warnings go away. Now, how can I automate this? I see no menu options in either Hyper-V or the Failover Cluster Manager for performing a migration or shutdown when dynamic memory goes into the warning state. Any ideas about how to script this, or monitor it and invoke the action directly, would be helpful. If the solution involves coding, PowerShell would be ideal, but I could envision this as a .NET service that monitors for this state and kicks off the migration request. I just don't know what objects are involved in doing the monitoring or kicking off the live migration. Thanks in advance.
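
    To make the idea concrete, the rough shape I have in mind is below. The performance counter name and the cluster cmdlet usage are my best guesses from the documentation, not something I have tested, and mapping a counter instance name to the matching cluster group will probably need extra work:

        Import-Module FailoverClusters

        # Guessed counter set: dynamic memory pressure for every VM on this host.
        # Pressure near 100 means demand roughly equals assigned memory, so treat
        # anything well above that as the "Warning" state.
        $samples = (Get-Counter '\Hyper-V Dynamic Memory VM(*)\Current Pressure').CounterSamples

        foreach ($s in $samples) {
            if ($s.CookedValue -gt 100) {
                # Assumes the counter instance name matches the clustered VM role name,
                # and that the target node is chosen elsewhere (hard-coded here).
                Move-ClusterVirtualMachineRole -Name $s.InstanceName -Node "HV-NODE-02"
            }
        }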

    Read the article

  • Can't connect to MySQL server from Java application

    - by RN
    This is on a VPS/CentOS server; the MySQL server is pre-configured, and I am running the Java application on Tomcat. My Java web application is not able to connect to the MySQL server. I get an error - "Caused by: java.net.ConnectException: Connection refused". I suspect this to be a configuration problem rather than a coding problem - hence I have posted this on Server Fault. And yes, the same web app is able to connect to MySQL on a different Linux box. This is the URL that I provided to my Java application (note that it assumes the default port): url = "jdbc:mysql://localhost/pickupgames". My first suspicion was that I am running on a non-default port, so I tried to find the port the MySQL server is running on. I tried every trick mentioned in http://serverfault.com/questions/116100/how-to-check-what-port-mysql-is-running-on but no luck:

        SHOW GLOBAL VARIABLES LIKE 'PORT';   (shows port 0)
        netstat -tlnp                        (doesn't show mysql at all)
        /etc/my.cnf                          (has no port entry)
        telnet localhost 3306                (doesn't connect)

    And in case you are wondering whether the MySQL server is running at all: it is, and I know for sure because I have been able to log in using the mysql command. Also:

        # ps -ef|grep 'mysql'
        root 31839 27662 0 00:49 pts/3 00:00:00 grep mysql
        root 32452 1 0 Apr02 ? 00:00:00 /bin/sh /usr/bin/mysqld_safe --skip-grant-tables --skip-networking
        mysql 32504 32452 0 Apr02 ? 00:00:06 /usr/libexec/mysqld --basedir=/usr --datadir=/var/lib/mysql --user=mysql --pid-file=/var/run/mysqld/mysqld.pid --skip-external-locking --socket=/var/lib/mysql/mysql.sock --skip-grant-tables --skip-networking

    Please note the --skip-networking parameter. Does this have something to do with the issue? Any explanation why I can't connect to the MySQL server on port 3306 by telnet, or why it doesn't show up under netstat? Any suggestion on what I should try next?
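
    For reference, this is what I'm planning to check next, on the assumption that --skip-networking is what disables TCP connections entirely (and that --skip-grant-tables was left over from a password reset):

        # See whether the flags come from the config file or were passed by hand:
        grep -n 'skip-networking\|skip-grant-tables' /etc/my.cnf

        # If they are not in my.cnf, mysqld_safe was probably started with them
        # manually; restarting the service normally should drop them:
        service mysqld restart

        # Then confirm MySQL is actually listening on TCP:
        netstat -tlnp | grep 3306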

    Read the article

  • Window Management for Mac OS X

    - by Paolo Maffei
    OK, I feel dumb. I've put many hours into this and found nothing yet. When I was using Windows I had a little tool called WinSplit Revolution. What it did was let you divide your screen into as many "virtual monitors" as you chose, of whatever size you chose. You set up how you want to divide your monitor once, and then every time WinSplit is opened the monitor is automatically divided into those virtual monitors. Screenshots: http://www.google.com/images?hl=en&q=winsplit%20revolution&um=1&ie=UTF-8&source=og&sa=N&tab=wi&biw=1045&bih=499 I'm now using a 30" display which I want almost always divided into 4 equal-size "virtual monitors" (plus my 13" MBP, that makes 5 virtual monitors of 1280x800). Now I've switched to Mac OS X and can't find anything that does just this efficiently. I tried Divvy, but I found no way to divide my screen into arbitrary "virtual monitors"; I need a couple of clicks to select a 3x3 space on a 9x9 grid. Before I start coding something like this myself, can you tell me if you already know of some software that does window management like this?

    Read the article

  • How to use a macro command button in Mac Excel 2011

    - by user21255
    I'm using Mac Excel 2011 and I can't seem to get a macro to work. What I am trying to do is this: in the first worksheet, I want all of the data entered in the Cases table at the bottom to be automatically inserted into the table in the "Cases" worksheet when I click the "Update" button. Instead, I keep getting a pop-up saying there is a runtime error, which then asks if I want to End, Debug or something else. I don't know if it is because I am not using Mac Excel correctly (I am used to Windows), because I believe the code in the VBA editor for the button is correct. Could anyone who has Mac Excel 2011 check whether the button works in the file provided? If anyone has Windows Excel, please feel free to check whether it works there as well. If it is a coding problem, please let me know. My question is simply how to run and stop a macro in Mac Excel 2011. The file can be accessed here: http://ge.tt/76qNwIx/v/0 Thanks

    Read the article

  • How do I create and conveniently search through Libraries in Windows 8?

    - by mtone
    In Windows 7, I got into the habit of exposing most of my frequently accessed disk areas as Libraries - there were about a dozen. Typing a word in the Start menu would then give me a summary of matches by Library. For example, searching for "WPF" would tell me that I've got some results in the Books library, in the Coding library and a few other PDFs in the Downloads library, one of which I could then expand to see all results within. In Windows 8, that functionality appears to be gone. The Search function in the Charms Bar lists tons of results by type (Documents, Pictures, et cetera) but not by Library. This is practically useless, since Documents contains hundreds of .txt and .cs files, a few of which might be Books or Downloads. The only option I found is to go into Explorer and use the search bar in the Library section. However, there again, all search results are mixed together, and I can't seem to find a way to know which Library each result came from (in the Details view, I didn't find a Library column I could add). So, if I want to know which Library contains stuff about a given topic, I have to search the Libraries one by one. Very inconvenient. Is Microsoft slowly deprecating Libraries? Any tips? How else can I search through Libraries?

    Read the article

  • Security in shared hosting vs VPS 'virtual appliances'

    - by Pedro Loureiro
    I have to change my hosting provider. Right now I have a shared hosting account, but I'm considering trying the LAMP stack appliance from turnkeylinux.org. I'm very comfortable using Linux; I've been using it for a long time. I have no problem SSHing into remote machines and doing whatever I have to do (coding, reading logs, moving files, deploying, etc.). The problem is that none of those tasks have involved securing the server/firewall. My experience has been as a desktop user or as a developer deploying apps/files to remote servers. Ignoring the security in the application logic (read: any scripts, frameworks or websites I might have created or installed), I'm worried about things like the base configuration of daemons, the firewall, ports, executable scripts being readable from the outside, and whatnot. My question is: how do you compare the (expected) out-of-the-box security of the LAMP stack from TurnKey and the (expected) security of a "regular" shared hosting provider? I was hoping to find some guides with a list of steps to take to protect my server, but the only documentation I found simply referred to Ubuntu's documentation.

    Read the article

  • PHP output buffer settings ignored by server

    - by Ecom Evolution
    I have been trying to flush the output of certain scripts to the browser on demand, but it does not work on our production server. For instance, I tried running the "Phoca Changing Collation tool" (find it on Google) and I don't see any output until the script finishes executing. I've tried immediately flushing the buffer in other scripts that work fine on any server but this one, using the following code: echo "something"; ob_flush(); flush(); Setting "ob_implicit_flush(1);" doesn't help either. The server is Apache 2.2.21 with PHP 5.2.17 running on Linux. You can see our php.ini file here if that will help: http://www.smallfiles.org/download/1123/php.ini.html This isn't the only problem we are having with the server ignoring in-script directives. The server also ignores timeout settings such as ini_set('max_execution_time', 900*60); and set_time_limit(86400); - the script always times out at the php.ini default. It doesn't seem to matter whether the script is executed from IE or Firefox. I tried "ini_set('zlib.output_compression_level', 'Off');" and checked that it is "Off" in the php.ini file. The code "apache_setenv('no-gzip', 1);" causes a fatal error, so I tried uploading a .htaccess file with the "mod_gzip_on No" directive. Neither helps. I tried running Apache as fcgi and suphp, but got the same results. The server is NOT in safe mode. Pulling my hair out!
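
    For reference, my test script boils down to the following. The str_pad padding is just a guess at defeating Apache-side buffering of small writes; everything else is the flushing checklist I've pieced together so far:

        <?php
        // Turn off every output buffer I can reach before sending anything.
        ini_set('zlib.output_compression', 0);
        ini_set('implicit_flush', 1);
        while (ob_get_level() > 0) {
            ob_end_flush();
        }
        ob_implicit_flush(1);

        for ($i = 0; $i < 5; $i++) {
            // Pad each chunk in case something upstream buffers small writes.
            echo str_pad("chunk $i", 4096), "\n";
            flush();
            sleep(1);
        }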

    Read the article

  • Picking a linux compatible motherboard

    - by Chris
    Last time I bought a new computer (I build them myself) I got a motherboard that had really poor Linux support for a long time - specifically the audio. I had to wait months before the kernel supported the on-board audio chipset. That is exactly the situation I'm trying to avoid this time around. I have some specific questions about "server motherboards", actually. I looked at a few models of server motherboards by Intel, and some random models on Newegg. I wasn't able to see much of a difference from regular desktop motherboards, other than that most had two sockets and support for much more RAM. These boards seem more popular with Linux users. Why? AMD and Intel both have server CPUs as well - same question, what's the difference? To make this question more concrete, I was looking at this motherboard. The main questions about it that I can't answer are: Can I get a motherboard without on-board RAID and audio? I wanted to get a hardware RAID controller and a PCI audio card. I thought a server motherboard would be cheaper and not have these "extras", since who wants an audio card on a server? Where can I find out about Linux support for the components on this board ("Intel ICH10R", "Realtek ALC889", "Marvell 88E8056")? I'm buying this computer to work as a Linux desktop for a lot of compiling, coding and audio/video work, but I don't want to rule out the possibility of installing Windows and playing some games at one point (even if the last game I got has been sitting in its box unopened for almost a year). Is it a good idea to buy a "server motherboard" and play games on it, or are desktop boards better value for this? The ultimate solution for me would be a motherboard that had GPL drivers for the on-board LAN, a single CPU socket, lots of PCI Express and PCI, USB 3.0, and no fancy hard disk controllers, since I'll be getting a separate one.

    Read the article

  • SQL Server 2005 SP3 on Windows 7 - No Management Studio

    - by Mike Thomas
    I've been trying for a day and a half now to get SQL Server 2005, DEV edition, to work on Windows 7, 64-bit Professional. I install from the disc, then run SP3. I get a failure on the Client Components section of the Installation Progress, along with this vague message:

        Product                    : Client Components
        Product Version (Previous) : 1399
        Product Version (Final)    :
        Status                     : Failure
        Log File                   : C:\Program Files\Microsoft SQL Server\90\Setup Bootstrap\LOG\Hotfix\SQLTools9_Hotfix_KB955706_sqlrun_tools.msp.log
        Error Number               : 1712
        Error Description          : MSP Error: 1712  One or more of the files required to restore your computer to its previous state could not be found. Restoration will not be possible.

    I've uninstalled all of Visual Studio and tried to make this as clean as possible, and have read a lot of the blog posts, but am really at my wits' end about this. I am not a DBA, but I use SQL Server all the time when coding and testing apps. Does anyone have any ideas as to where I can get this sorted out? I've been at this for a long time and have never encountered an installation as bad as this one. Thanks Mike Thomas

    Read the article

  • APACHE - PHP - Bounce emails

    - by user1179459
    I want to improve our mailing lists by handling all the bounces we get on our websites. I have a website with over 8000 users and another with over 1500 users; they are emailed various notifications every second (job alerts, email alerts, etc.). I am using POP connections with Exim on an Apache server, and most of the email is generated on the fly by PHP scripts.

    Problems I have:

    - Some users registered a long time ago, and quite a few of them now have bouncing email addresses.
    - Some users register with dummy addresses like [email protected] that never existed but are syntactically valid. Is there any chance of stopping this without asking them to log in to the email account and click confirmation links, which don't work most of the time (and are too annoying for the end user)?
    - The server is sending unnecessary emails that could be avoided if I knew the addresses don't exist.

    Solutions I need:

    1. Is there a way I can download the bounced email list somewhere (WHM/cPanel)? I know Exim has it, but it's not readable - I need a file like CSV or something similar that I can scan, so I can write a PHP script to delete those users from the database.
    2. Is there any function in PHP that can check the existence of an email address on the fly, so that I can make the send function in the mailer class check before it sends out?
    3. Will bouncing emails eat up a lot of server resources (memory/CPU) while they are processed, or is the overhead minimal enough that we don't have to worry about it at all?
    4. Is there an open-source or Linux tool to capture bounces, view them in report form and clean them up?

    I am not a Linux expert or server admin, but I do a lot of PHP coding, so please be descriptive in your solutions, especially if they involve Linux commands. Thank you!
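
    For point 2, something along these lines is what I have in mind, assuming PHP 5.2+ for filter_var. It only proves that the domain can receive mail (syntax check plus an MX or fallback A record), not that the individual mailbox exists, and the $user_email variable is just a placeholder:

        <?php
        // Rough sketch: reject addresses that are syntactically invalid or
        // whose domain has no MX (or fallback A) record.
        function email_looks_deliverable($email)
        {
            if (filter_var($email, FILTER_VALIDATE_EMAIL) === false) {
                return false;
            }
            $domain = substr(strrchr($email, '@'), 1);
            return checkdnsrr($domain, 'MX') || checkdnsrr($domain, 'A');
        }

        // Example: skip (and flag) an account instead of emailing an address
        // that cannot possibly accept the notification.
        if (!email_looks_deliverable($user_email)) {
            // mark the user for cleanup rather than sending
        }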

    Read the article

  • User-friendly program to create editable & searchable pdf files, like tax & application forms and su

    - by Nick Gorbikoff
    Hello. Can somebody recommend a user-friendly program that will create (or convert from Excel, Word or OpenOffice) editable PDF forms? You know, like the tax forms some of us have filled out - where you can create a form with a predefined format and stationery, but let the user edit/fill out the fields. I need something user-friendly that a regular person can use. I'm NOT looking for a PDF library (I already use wkhtmltopdf for generating PDFs programmatically). The reason is that we have about 400 documents (internal expense forms, training forms, etc.) in .doc and .xls format that we want to convert to editable PDFs (so that people don't have to fill them out by hand). Coding 400 templates and then converting them using some library or command-line tool is not my idea of fun, especially since those forms change all the time. I'd like to just give HR and the Quality department the tool, so that they can maintain those documents themselves. I looked at everything listed on this page (http://www.cogniview.com/convert-pdf-to-excel/post/pdf-editing-creation-50-open-sourcefree-alternatives-to-adobe-acrobat/), but can't find what I need. Thank you!!!

    Read the article

  • Building a PC for Work and play? [closed]

    - by Derek Organ
    OK, it's been a long time since I built my own PC, so I'm looking to get back into it again and build a new one. First off, the budget is about €800, excluding the monitor, the Windows 7 licence and the mouse (I just bought a new G500). I plan on using the computer for work, with lots of applications open at once but none particularly demanding (Photoshop being the heaviest; mostly coding tools). I also use it for some gaming, e.g. COD, StarCraft, etc. One thing I do want to do eventually is get a really good monitor with a high resolution, maybe 27", so the graphics card needs to be able to make the best use of that. So, a few questions:

    1) Is the performance bottleneck mostly still the hard drives?
    2) Aren't most processors (i5, even i3) so far ahead of the other bottlenecks that it makes little difference how high you go? Isn't the graphics card dealing with the heavy graphics, so what really slows down because of a slow CPU?

    From this, my thinking is to get an SSD as my primary drive for the OS etc., loads of memory (6-8GB) and a decent mid-level graphics card. At my level it doesn't seem worth spending much on the CPU or any other parts, really. The basic parts off the top of my head: case, motherboard, CPU, SSD, SATA drive, power supply, memory, cooling (fan?), graphics card, network card, keyboard, DVD drive, mouse, Windows, monitor. Am I missing anything? Any helpful tips or general education much appreciated. Thanks, Derek

    Read the article

  • emails not sending from CentOS 5.6 VM on Win7 via PHP code

    - by crmpicco
    I am experiencing an issue where my CentOS 5.6 (Final) VM running on Windows 7 has stopped sending emails from my PHP code. I'm confident this isn't a coding issue, as I have the exact same code running in my office and emails send correctly from there; hence I believe this to be a networking/configuration issue. In the /etc/hosts file on my VM I have the following:

        127.0.0.1    localhost.localdomain localhost
        192.168.0.9  crmpicco.co.uk m.crmpicco.co.uk dev53.localdomain

    When I run setup on my VM, the DNS configuration is set to dev53.localdomain and my primary DNS is 192.168.0.1. In my /var/log/maillog file I see a lot of this sort of thing:

        Nov 19 14:36:58 dev53 sendmail[21696]: qAJEawI7021696: from=<[email protected]>, size=12858, class=0, nrcpts=1, msgid=<1353335817.9103820024efb30b451d006dc4ab3370@PHPMAILSERVER>, proto=ESMTP, daemon=MTA, relay=localhost.localdomain [127.0.0.1]
        Nov 19 14:36:58 dev53 sendmail[21693]: qAJEawvd021693: [email protected], [email protected] (48/48), delay=00:00:00, xdelay=00:00:00, mailer=relay, pri=42681, relay=[127.0.0.1] [127.0.0.1], dsn=2.0.0, stat=Sent (qAJEawI7021696 Message accepted for delivery)
        Nov 19 14:36:59 dev53 sendmail[21698]: qAJEawI7021696: to=<[email protected]>, delay=00:00:01, xdelay=00:00:01, mailer=esmtp, pri=132858, relay=mailserver.fletcher.co.uk. [213.171.216.114], dsn=5.0.0, stat=Service unavailable

    Is this likely to be a configuration issue?
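
    The next thing I plan to try from inside the VM, on the assumption that the relay sendmail is handing off to is unreachable or refusing connections from my home network, is simply:

        # Can the VM resolve and reach the relay that maillog says it is using?
        host mailserver.fletcher.co.uk
        telnet mailserver.fletcher.co.uk 25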

    Read the article

  • How to add a privilege to an account in Windows?

    - by mark
    Given:

    - A VM running Windows 2008.
    - I am logged on there using my domain account (SHUNRANET\markk).
    - I have added the "Create global objects" privilege to my domain account.
    - The VM is restarted (I know logging off and on is enough, but I had to restart).
    - I log on again using the same domain account; it still seems to have the privilege.
    - I run some process and examine its Security properties using Process Explorer: the account does not seem to have the privilege.

    This is not an idle curiosity. I have a real problem: without this privilege the named pipe WCF binding works on neither Windows 2008 nor Windows 7! Here is an interesting discussion on this matter - http://social.msdn.microsoft.com/forums/en-US/wcf/thread/b71cfd4d-3e7f-4d76-9561-1e6070414620. Does anyone know how to make this work? Thanks.

    EDIT: BTW, when I run the process elevated, everything is fine and Process Explorer displays the privilege as expected. But I do not want to run it elevated.

    EDIT 2: I equally welcome any solution, be it configuration only or mixed with code.

    EDIT 3: I have posted the same question on the MSDN forums and they have redirected me to this page - http://support.microsoft.com/default.aspx?scid=kb;EN-US;132958. I have yet to determine its relevance, but it looks promising. Notice also that it is a completely code-based solution that they propose, so whoever moved this post to Server Fault - please move it back to Stack Overflow.

    Read the article

  • Backing up Excel Files to a different Directory

    - by Joe Taylor
    In Excel 2007, in the Save As box there is an option to 'Create a Backup' which simply backs up the file whenever it is saved. Unfortunately it backs up the file to the same directory as the original. Is there a simple way to change this directory to another drive/folder? I have messed about with macros to do this, coming up with:

        Private Sub Workbook_BeforeClose(Cancel As Boolean)
        'Private Sub Workbook_BeforeSave(ByVal SaveAsUI As Boolean, Cancel As Boolean)
            'Saves the current file to a backup folder and the default folder
            'Note that any backup is overwritten
            Application.DisplayAlerts = False
            ActiveWorkbook.SaveCopyAs Filename:="T:\TEC_SERV\Backup file folder - DO NOT DELETE\" & _
                ActiveWorkbook.Name
            ActiveWorkbook.Save
            Application.DisplayAlerts = True
        End Sub

    This creates a backup of the file OK the first time. However, if it is tried again I get:

        Run-time error '1004':
        Microsoft Office Excel cannot access the file 'T:\TEC_SERV\Backup file folder - DO NOT DELETE\Test Macro Sheet.xlsm'. There are several possible reasons:
        - The file name or path does not exist
        - The file is being used by another program
        - The workbook you are trying to save has the same name as a...

    I know the path is correct; I also know that the file is not open anywhere else. The workbook has the same name as the one I'm trying to save over, but it should just overwrite. I have posted the question about the coding on Stack Overflow but wondered if there is an easier way to do this. Any help would be much appreciated. Joe

    Read the article

  • AutoHotkey script with multiple functions

    - by Vince
    Is it possible to use the script below but add a second hotkey and a function that goes with it?

        ;DoOver.ini
        ;[Settings]
        ;record={LCtrl}{F12}   ;hotkey to start and stop recording
        ;playback={LCtrl}{F5}  ;hotkey to start playback
        ;keydelay=10           ;ms to wait after sending a keypress
        ;windelay=100          ;ms to wait after activating a window
        ;movemouseafter=1      ;move the mouse to original pos after playback 1=yes 0=no

        [Settings]
        record={LCtrl}{F12}
        playback={LCtrl}{F5}
        keydelay=10
        windelay=100
        movemouseafter=1
        macro={WinActive}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{LCTRL Down}{Right}{Right}{LCTRL Up}{LSHIFT Down}{End}{LSHIFT Up}{LCTRL Down}{c}{LCTRL Up}{MouseClick,L,236,116,1,0,D}{MouseClick,L,54,116,1,0,U}{LCTRL Down}{LCTRL Up}{MouseClick,L,474,64,1,0,D}{MouseClick,L,474,64,1,0,U}{MouseClick,L,451,77,1,0,D}{MouseClick,L,451,77,1,0,U}{MouseClick,L,44,225,1,0,D}{MouseClick,L,44,225,1,0,U}

    OR, with the second hotkey:

        playback={LCtrl}{F7}
        keydelay=10
        windelay=100
        movemouseafter=1
        macro={WinActive}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{Down}{LCTRL Down}{Right}{Right}{LCTRL Up}{LSHIFT Down}{End}{LSHIFT Up}{LCTRL Down}{c}{LCTRL Up}{MouseClick,L,236,116,1,0,D}{MouseClick,L,54,116,1,0,U}{LCTRL Down}{LCTRL Up}{MouseClick,L,474,64,1,0,D}{MouseClick,L,474,64,1,0,U}{MouseClick,L,451,77,1,0,D}{MouseClick,L,451,77,1,0,U}{MouseClick,L,44,225,1,0,D}{MouseClick,L,44,225,1,0,U}

    Maybe add something like what is printed in bold here. I know the coding isn't right here, but I think this is the best way to describe what I am looking for. Anybody?

    Read the article

  • How to read cell data in Excel and output it to the command prompt

    - by Max Ollerenshaw
    Hi all, I'm a sysadmin and I am trying to learn how to use PowerShell. I have never done any type of scripting or coding before, and I have been teaching myself online from the TechNet Script Center and online forums. What I am trying to accomplish is to open an Excel spreadsheet, get information from it (usernames and passwords) and then output it to the command prompt in PowerShell. Whenever I try to do this I get an exception calling "InvokeMember". Anyway, here is the code I have so far:

        function Invoke([object]$m, [string]$method, $parameters)
        {
            $m.PSBase.GetType().InvokeMember( $method, [Reflection.BindingFlags]::InvokeMethod, $null, $m, $parameters, $ciUS )
        }

        $ciUS = [System.Globalization.CultureInfo]'en-US'

        $objExcel = New-Object -comobject Excel.Application
        $objExcel.Visible = $False
        $objExcel.DisplayAlerts = $False

        $objWorkbook = Invoke $objExcel.Workbooks.Open "C:\PS\User Data.xls"
        Write-Host "Numer of worksheets: " $objWorkbook.Sheets.Count

        $objWorksheet = $objWorkbook.Worksheets.Item(1)
        Write-Host "Worksheet: " $objWorksheet.Name

        $Forename = $objWorksheet.Cells.Item(2,1).Text
        $Surname = $objWorksheet.Cells.Item(2,2).Text
        Write-Host "Forename: " $Forename
        Write-Host "Surname: " $Surname

        $objExcel.Quit()
        If (ps excel) { kill -name excel }

    I have read many different posts on forums and articles on how to try to get around the en-US problem, but I cannot seem to get around it and hope that someone here can help! Here is the exception I mentioned:

        Exception calling "InvokeMember" with "6" argument(s): "Method 'System.Management.Automation.PSMethod.C:\PS\User Data.xls' not found."
        At C:\PS\excel.ps1:3 char:33
        + $m.PSBase.GetType().InvokeMember <<<< (
            + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
            + FullyQualifiedErrorId : DotNetMethodException

        Numer of worksheets:

        You cannot call a method on a null-valued expression.
        At C:\PS\excel.ps1:18 char:45
        + $objWorksheet = $objWorkbook.Worksheets.Item <<<< (1)
            + CategoryInfo          : InvalidOperation: (Item:String) [], RuntimeException
            + FullyQualifiedErrorId : InvokeMethodOnNull

        Worksheet:

        You cannot call a method on a null-valued expression.
        At C:\PS\excel.ps1:21 char:37
        + $Forename = $objWorksheet.Cells.Item <<<< (2,1).Text
            + CategoryInfo          : InvalidOperation: (Item:String) [], RuntimeException
            + FullyQualifiedErrorId : InvokeMethodOnNull

        You cannot call a method on a null-valued expression.
        At C:\PS\excel.ps1:22 char:36
        + $Surname = $objWorksheet.Cells.Item <<<< (2,2).Text
            + CategoryInfo          : InvalidOperation: (Item:String) [], RuntimeException
            + FullyQualifiedErrorId : InvokeMethodOnNull

        Forename:
        Surname:

    This is the first question I have ever asked, so try to be nice! :)) Many thanks, Max
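
    One guess I want to test, based on the first error (it shows 'System.Management.Automation.PSMethod' plus the file path where the object and method name should be): maybe I have to hand the helper the Workbooks object, the method name as a string and the arguments as an array, rather than the .Open method itself. Something like this, untested:

        # Untested guess: pass the object, the method name and the arguments
        # separately, so InvokeMember ends up calling Open("C:\PS\User Data.xls").
        $objWorkbook = Invoke $objExcel.Workbooks "Open" @("C:\PS\User Data.xls")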

    Read the article

  • Conditionally Rewrite Email Headers (From & Reply-To) Exchange 2010

    - by NorthVandea
    I have a client who maintains Company A (with email addresses %username%@companyA.com), and they own the domain companyB.com; however, there is no "infrastructure" (no Exchange server) set up specifically for companyB.com. My client needs the end users within its company (companyA.com) to be able to add a specific word or phrase to the Subject (or Body) of an outgoing email (they are only concerned with outgoing; incoming is a non-issue in this case) that triggers the Exchange 2010 servers to rewrite the From and Reply-To headers [email protected] with [email protected], but this rewrite should ONLY occur if the user places the keyword/phrase in the Subject (or Body). I have attempted using transport rules and the New-AddressRewriteEntry cmdlet; however, each seems to have a limitation. From what I can tell, transport rules cannot rewrite the From/Reply-To fields, and New-AddressRewriteEntry cannot be conditionally triggered based on message content. So, to recap:

    - User sends email outside the organization: From and Reply-To remain [email protected]
    - User sends email outside the organization WITH "KeyWord" in the Subject or Body: From and Reply-To change to [email protected] automatically.

    Does anyone know how this could be done WITHOUT coding a new mail agent? I don't have the programming knowledge to code a custom agent... I can use any function of the Exchange Management Shell or Console. Alternatively, if anyone knows of a simple add-on program that could do this, that would be good too. Any help would be greatly appreciated! Thank you!!!

    Read the article

  • Anyone know a good mind mapper that works with a scheduler?

    - by GLycan
    TL;DR: Mind-mapping tasks to be processed into a schedule based on task metadata.

    I have all sorts of ideas about what to invest resources (mainly time) in, but when I actually have time to do something I usually end up browsing reddit because I don't know what to do, and the frequency with which I forget deadlines scares me. I'd love to bring order and structure into my mind, and always know what to do next. So, I want a mind-mapping app where I'd give each branch (types and subtypes of things I want to do) an importance score (if there were two branches, one with 60 and the other with 40, they would respectively get 60% and 40% of the parent's importance, with the root being 100) and an interval for how often that branch should be revised/updated (a hobby I want to try out might be checked, say, once a week, while a school subject should be checked once a day). Each leaf (something I want/need to do) would get how much time it takes, a deadline (if any), and optionally an absolute importance, a recurrence (guitar practice might repeat once a week) and prerequisites (reading something requires that book, although that could be brought somewhere; coding requires a box; jogging requires being outside), plus maybe some other flags, like whether it's enjoyable or not. It should either be packaged with, or work with, a scheduler app, to which I'd say: look, my day works this way (completely busy from 8 to 9:15, then 15 minutes of being inside with nothing, ..., two hours with a box and the possibility to go outside, etc.), and such-and-such pattern is school and happens every weekday except such-and-such days. The output should be in the form of a schedule, fit for printing or, when I finally get an Android, mobile viewing, that schedules tasks with regard to the availability of resources and importance (importance being derived from the leaf task's parent branches) and the set of flags (all work and no play makes me a dull boy). One of these tasks should be reviewing anything that should be updated on that day, including future day layouts (e.g. if the time slots of future days have changed); this should be done every day. Does anyone know of a collection of preferably open-source (or free, or piratable) tools, or better yet a single one, that accomplishes this? I know Python pretty well, and should be able to write any necessary glue.
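
    To illustrate the importance propagation I have in mind, here's a toy Python sketch (the tree and the numbers are made up):

        # Each branch has a weight; a leaf's effective importance is the root's 100
        # multiplied down the chain by each branch's share of its siblings' weights.
        tree = {
            "weight": 100,
            "children": {
                "school":  {"weight": 60, "children": {"thesis": {"weight": 1, "children": {}}}},
                "hobbies": {"weight": 40, "children": {"guitar": {"weight": 1, "children": {}}}},
            },
        }

        def effective_importance(node, inherited=100.0):
            total = sum(c["weight"] for c in node["children"].values())
            result = {}
            for name, child in node["children"].items():
                share = inherited * child["weight"] / total if total else inherited
                if child["children"]:
                    result.update(effective_importance(child, share))
                else:
                    result[name] = share
            return result

        print(effective_importance(tree))   # {'thesis': 60.0, 'guitar': 40.0}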

    Read the article

  • Getting Dell E6320 with I7 to work with 3 monitors at 1920x1080p x 3

    - by MadBoy
    I want to buy a Dell E6320, which comes with an Intel Core i7-2620M (2.70GHz, 4MB cache, dual core) and Intel HD Graphics 3000. The laptop will come with a docking station. I want to connect 3 monitors to that docking station so that working at home gives me some additional boost. The docking station will only allow me to connect 2 monitors, so I'm looking at the following other options:

    1. A Matrox TRIPLEHEAD2GO DIGITAL Edition or TRIPLEHEAD2GO DP Edition. But reading the Matrox support page, the Intel GPU can't run the highest resolution with 3 monitors connected, and it gets even worse since it seems the monitors would have to be able to work at 50Hz. Also, I'm not sure, but it seems that Matrox doesn't split the space into 3 separate monitors but simply presents it as one big space (which is a bit the opposite of what I need).
    2. Buy 2, or maybe just 1, USB-based monitor, but that would also mean having 1 or 2 monitors different from the main one, unless I buy 3 USB-based monitors, which would mean more money to spend. Also, I found only a couple of models, and most of them require USB 3.0 and no other cables to plug in (nice but costly - I couldn't find a decent monitor with only USB for sending the signal and power connected normally). But the docking station has only one USB 3.0 port. Can I use a hub and still get it to work?
    3. Find some converters from digital to USB (I think DisplayLink does some?).
    4. Buy a different laptop - but what kind? I need it to be an i7, small (13"), fast and lightweight. At the same time it requires a docking station that I can use at home to connect 3 external monitors.
    5. Some other suggested solution...

    Edit: I need 3 monitors for work, in terms of coding in Visual Studio or having Word/Excel/Outlook open. Nothing fancy. Maybe some movie once in a while.

    Read the article

  • How to publish internal data to the internet - as simple as possible

    - by mlarsen
    I asked this on Stack Overflow, but I would like your opinion too, as it has as much to do with administration as it does with coding. We have a .NET 2-tier application where a desktop program talks to a database. We support MS SQL Server 2000, 2005 and 2008 and Oracle 9, 10 and 11. The application is sold, not as shrink-wrap, but pretty close. It is quite important for us that the installation and configuration are as easy as possible, as installation instructions are usually supplied in written form to the customer's internal IT department. Our application is usually not seen as mission-critical by the IT department, so we need to keep their work down to a minimum. Now we are starting to get wishes for a web application built on top of the same data. The web application will be hosted by us and delivered as a SaaS application. The challenge is how to move data back and forth between the web application and the customer's internal database. As I see it, we have some requirements:

    - We must be ready to handle the situation where the customer's database is not accessible from the DMZ. I guess the easiest solution is that all communication is initiated from inside the customer's LAN.
    - As little firewall configuration as possible. The best is if we can run without any special configuration as long as outgoing traffic from the customer's LAN is not blocked. If we need something changed in the firewall, we must be able to document that the change is secure.
    - It doesn't have to be real time. Moving data in batches every ten minutes or so is OK.
    - Data moves both ways, but not the same tables, so we don't have to support merges.
    - It would be nice if we don't have to roll our own framework completely.

    Looking forward to hearing your suggestions.
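
    What I picture for the customer side is a small agent inside the LAN that wakes up every ten minutes, pushes pending rows out over HTTPS and takes back whatever is waiting for it, so nothing ever has to connect inbound. A rough sketch only - the URL, the payload format and the idea of plain string batches are all invented for illustration:

        using System;
        using System.Net;
        using System.Threading;

        class SyncAgent
        {
            static void Main()
            {
                while (true)
                {
                    try
                    {
                        using (WebClient client = new WebClient())
                        {
                            // In reality this would be pending rows read from the
                            // customer's database, serialized as XML or similar.
                            string outgoing = "<batch>pending local changes</batch>";

                            // One outbound HTTPS call both uploads our batch and
                            // returns the batch the hosted side has queued for us.
                            string incoming = client.UploadString(
                                "https://saas.example.com/sync?customer=42", outgoing);

                            Console.WriteLine("Received {0} chars to apply locally", incoming.Length);
                        }
                    }
                    catch (WebException ex)
                    {
                        Console.WriteLine("Sync failed, will retry next cycle: " + ex.Message);
                    }

                    Thread.Sleep(TimeSpan.FromMinutes(10));
                }
            }
        }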

    Read the article

  • Bash Parallelization of CPU-intensive processes

    - by ehsanul
    tee forwards its stdin to every single file specified, while pee does the same, but for pipes. These programs send every single line of their stdin to each and every file/pipe specified. However, I was looking for a way to "load balance" the stdin across different pipes, so one line is sent to the first pipe, another line to the second, etc. It would also be nice if the stdout of the pipes were collected into one stream as well. The use case is simple parallelization of CPU-intensive processes that work on a line-by-line basis. I was doing a sed on a 14GB file, and it could have run much faster if I could use multiple sed processes. The command was like this:

        pv infile | sed 's/something//' > outfile

    To parallelize, the best would be if GNU parallel supported this functionality, like so (I made up the --demux-stdin option):

        pv infile | parallel -u -j4 --demux-stdin "sed 's/something//'" > outfile

    However, there's no option like this, and parallel always uses its stdin as arguments for the command it invokes, like xargs. So I tried this, but it's hopelessly slow, and it's clear why:

        pv infile | parallel -u -j4 "echo {} | sed 's/something//'" > outfile

    I just wanted to know if there's any other way to do this (short of coding it up myself). If there were a "load-balancing" tee (let's call it lee), I could do this:

        pv infile | lee >(sed 's/something//' >> outfile) >(sed 's/something//' >> outfile) >(sed 's/something//' >> outfile) >(sed 's/something//' >> outfile)

    Not pretty, so I'd definitely prefer something like the made-up parallel version, but this would work too.
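
    Since writing this I've seen hints that newer releases of GNU parallel have a --pipe mode that chops stdin into blocks and feeds them to the jobs, which sounds very close to my made-up --demux-stdin. I haven't checked which version introduced it, but it would presumably look something like:

        pv infile | parallel --pipe --block 10M -j4 "sed 's/something//'" > outfile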

    Read the article
