Search Results

Search found 81445 results on 3258 pages for 'file command'.


  • Anonymous access to SMB share hosted on Server 2008 R2 Enterprise

    - by bwerks
    Hi all. First off, I have read through this post and a whole slew of non-SF posts which seem to address the same or a similar problem, but I was still unable to fix mine. I've got three machines in this situation: a domain-joined server running Server 2008 R2 Enterprise ("share server"), a domain-joined workstation running XP Pro SP3 ("workstation"), and a domain-unjoined machine running Server 2003 R2 SP2 ("test server"). The share server exposes a share on the network that the test server must access--it's a Source/Symbol Server share for our debugging purposes. I believe Visual Studio simply accesses the share with its own credentials in this case, meaning that the share must be accessible anonymously, since the test server isn't joined to the domain and there's no opportunity to supply domain authentication. I've attempted a lot of things to avoid the authentication window when accessing the share:

    - Enabled the Guest account on the share server and given Guest full sharing/NTFS permissions for the share.
    - Given ANONYMOUS LOGON full sharing/NTFS permissions for the share.
    - Added my share to "Network access: Shares that can be accessed anonymously" in Local Security Policy.
    - Disabled "Network access: Restrict anonymous access to Named Pipes and Shares" in Local Security Policy.
    - Enabled "Network access: Let Everyone permissions apply to anonymous users" in Local Security Policy.
    - Added ANONYMOUS LOGON to "Access this computer from the network" in Local Security Policy.
    - Added the Guest account to "Access this computer from the network" in Local Security Policy.
    - Attempted to provision the share using the Share and Storage Management MMC snap-in.

    Unfortunately, when I attempt to access the share from the test server, I still see the prompt and am forced to enter "Guest" manually. I also tried this workflow using the local administrator account on the workstation, and the same thing happens both with and without XP Simple File Sharing enabled. Any idea why I'm getting these results, or what I should have done differently?

    Read the article

  • How to serve media across home network?

    - by TK Kocheran
    I'm looking to share my media across my home network. My router fully supports running a DLNA server, but I don't know whether it would be better to run the server from my main server computer instead of from the router, since the router would have to operate off a network share while my server can operate directly off the files. Here's what I need to serve, in order of importance: ISO 1:1 DVD rips (4-8 GB files), MP4/H.264-encoded videos, MKV videos, MP3 files, and JPEG/CR2 images. Maybe I'm completely ludicrous for wanting to push full DVD files across my network, but in reality I would assume that only the parts of the file actually needed (i.e. the menu and the main video payload for the main title) would be served at any one time. Plus, encoding takes time and precious disk space, so why not stream it 1:1 ;) Does anyone know of the best way to accomplish this? The main goal is to serve it to the Logitech Revue downstairs, and the secondary goal is to serve it to other computers in the house. For music, I assume I could run a DAAP server, but I don't think the Revue supports that (and I can't exactly throw together an app that does it just yet).
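
    For the server-side route, a minimal sketch under two assumptions that are not in the question -- that the server runs a Debian/Ubuntu-flavored Linux and that the media sits under /srv/media: minidlna is a lightweight DLNA server that serves video, music, and photos straight off the local file system.

        sudo apt-get install minidlna       # lightweight DLNA/UPnP media server
        # key lines in /etc/minidlna.conf (V=video, A=audio, P=pictures):
        #   media_dir=V,/srv/media/video
        #   media_dir=A,/srv/media/music
        #   media_dir=P,/srv/media/photos
        sudo service minidlna restart       # pick up the new media_dir entries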

    Read the article

  • Administrative shares in Windows 7 Pro not visible

    - by Chris Tybur
    My desktop machine has a clean install of Windows 7 Professional. For some reason the standard administrative shares Admin$, C$, D$, etc are not visible, either in Computer Management - Shared Folders - Shares or via net share. I also have a laptop with a clean install of Windows 7 Professional, and I can see the admin shares in both places. As such, I can map to \\laptop\c$ from the desktop, but I can't map to \\desktop\c$ from the laptop. I pretty much took the defaults during the Windows 7 installations. I've tried adding LocalAccountTokenFilterPolicy to the registry on the desktop, but that didn't work. On the desktop I've also disabled UAC, turned off Windows firewall, removed it from a homegroup, made sure file and printer sharing is turned on, but nothing has worked. There is some subtle difference between the two machines that I can't seem to find. I'm logging into both machines using a local account that is in the Administrators group. Both accounts have the same name and password. I really don't want to have to create a new share for the desktop's C drive, especially since C$ is visible and working on the laptop and therefore I should be able to make it work on the desktop. Any idea why the admin shares would work on one machine and not another? Or why LocalAccountTokenFilterPolicy would fail?

    Read the article

  • External Storage for 2TB of backups and 4TB of data RAID level? HW vs Software?

    - by Jerry Mayers
    I have a Mac Mini set up as a media center/file server. Currently I just have a hodgepodge mess of external drives for storage. I'm maxed out, and I have some new laptops on the way with much larger drives, so I need to work out a good storage solution for backing them up as well as storing media on the server. I need around 2 TB of storage for the Time Machine backups from my various systems and around 2 TB more for media, and I would like to build this to handle around 6 TB total so I have some growing room. Since I'm using a Mac Mini as the server, I need to use external enclosure(s) that support USB 2, FireWire 800 (preferred), or gigabit Ethernet. Performance isn't a huge concern, since the majority of the access from other computers is done over 802.11n. I plan on using 2 TB drives for the final version, but initially I'll use my existing two 1 TB drives plus some new 2 TB drives, and swap the 1 TB ones out as I fill up. As to the actual questions:

    - Should I use hardware RAID in some enclosure? If the enclosure dies, I have to find an identical one to get to my data, right? Wouldn't software RAID be better, since I can use any method of connecting the drives to the system? Remember, OS X Server is my OS. If I had to reinstall OS X, could I restore the software RAID easily?
    - What RAID level should I use? For the 2 TB used as the Time Machine disk I don't see why I need RAID -- a single 2 TB drive is enough, since it's already the backup -- but the remaining 4 TB would be the only copy of the data, so I should build in some redundancy.
    - I had a RAID 5 setup years ago using a cheap RAID PCI card running a 2 TB array, and when a drive died it took 48 hours to rebuild. Is that crazy slow for a setup of this size, or is it to be expected?
    - Any suggestions as to drive enclosures?
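
    On the software-RAID question, a minimal sketch using the AppleRAID support built into OS X's diskutil, assuming the two member disks show up as disk2 and disk3 (placeholders -- check the diskutil list output first):

        diskutil list                                    # identify the member disks
        diskutil appleRAID create mirror Media JHFS+ disk2 disk3
        diskutil appleRAID list                          # verify the new mirrored set

    The set's metadata lives on the member disks themselves, so a reinstalled OS X that can see both members should reassemble it -- one argument for software RAID over a proprietary hardware enclosure.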

    Read the article

  • batch file to deploy files

    - by Martin Michalak
    Hi, I have created a batch file which pulls info from a *.txt file and deploys code from the source to the destination:

        SET Source=%1
        if exist %Source% (
            ECHO Source for WEB exists
        ) else (
            ECHO Wrong build %Source% doesn't exist
            GOTO Menu
        )
        SET Server=%2
        SET AppPool=%3
        SET Destination=%4
        SET Folder=%5
        SET ENV=%6
        SET AppName=%7
        SET Envlog=%8

        ECHO Deployment of WEB > %Envlog% %Date% %Time%
        echo.
        @ECHO Stopping App Pools
        @ECHO Stopping App Pools >> %Envlog% %Date% %Time%
        D:\ICTTools\PSEXEC.EXE -d \\%Server% cmd.exe /c c:\windows\system32\inetsrv\appcmd STOP apppool /apppool.name:%AppPool%
        echo.
        @ECHO App Pools will be stopped in the background
        @ECHO App Pools will be stopped in the background >> %Envlog% %Date% %Time%
        Pause
        echo.
        IF EXIST "%Destination%" (
            ECHO Deleting %AppName% %Folder%
            RMDIR %Destination% /s /q
            ECHO Destination Folder %Folder% Deleted
            ECHO Destination Folder %Folder% Deleted >> %Envlog% %Date% %Time%
        ) else (
            ECHO Destination Folder %Destination% does not exist, please check
            ECHO Destination Folder %Destination% does not exist, please check >> %Envlog% %Date% %Time%
            Pause
        )
        echo.
        @ECHO Starting Robocopy for %AppName%
        @ECHO Starting Robocopy for %AppName% >> %Envlog% %Date% %Time%
        echo.
        START /WAIT /MIN ROBOCOPY.EXE %Source% %Destination% *.* /S /NP /R:3 /W:5 /LOG:"Logs\Robo%AppName%%ENV%.log"
        D:\Tools\Windiff\windiff.exe %Source% %Destination%
        echo.
        @ECHO Finished with Robocopy
        @ECHO Finished with Robocopy >> %Envlog% %Date% %Time%
        echo.
        @ECHO Checking if App pools stopped:
        @ECHO Checking if App pools stopped: >> %Envlog% %Date% %Time%
        D:\ICTTools\PSEXEC.EXE \\%Server% c:\windows\system32\inetsrv\appcmd LIST apppool /apppool.name:%AppPool%
        @echo off
        set /p ask=All app pools stopped? (y/n)
        if %ask%==y (echo Great, please continue with deployment) else echo Before continuing please check why app pools did not stop
        @echo App pools stopped?: %ask% >> %Envlog% %Date% %Time%
        DEL %Source%\web.config
        echo.
        @ECHO Production Config check
        if exist "%Destination%\%ENV%-Web.config" (
            echo.
            ECHO The Application production configuration file does exist.
            ECHO The Application production configuration file does exist. >> %Envlog% %Date% %Time%
            COPY %Destination%\%ENV%-Web.config web.config
            echo.
            ECHO Production %ENV%-Web.config has been renamed to web.config
            ECHO Production %ENV%-Web.config has been renamed to web.config >> %Envlog% %Date% %Time%
        ) else (
            ECHO The Application production configuration file is missing in Production %AppName%
            ECHO The Application production configuration file is missing in Production %AppName% >> %Envlog% %Date% %Time%
            explorer %Destination%
            Pause
        )
        echo.
        @ECHO Confirm that configs were renamed correctly, if yes please hit any key to START APP Pools
        @ECHO Confirm that configs were renamed correctly, if yes please hit any key to START APP Pools >> %Envlog% %Date% %Time%
        Pause
        echo.
        @ECHO Start %AppName% Application Pool >> %Envlog% %Date% %Time%
        D:\ICTTools\PSEXEC.EXE \\%Server% c:\windows\system32\inetsrv\appcmd START apppool /apppool.name:%AppPool%
        @echo off
        set /p ask=All app pools started? (y/n)
        if %ask%==y (echo Great, please continue with deployment) else echo Before continuing please check why app pools did not start
        @echo App pools started?: %ask% >> %Envlog% %Date% %Time%
        Pause
        echo.
        @ECHO Build Version for %AppName%
        @ECHO Build Version for %AppName% >> %Envlog% %Date% %Time%
        type %Destination%\buildinfo.xml
        echo.
        ECHO ...............................................
        @ECHO ...........Deployment Completed................
        @ECHO ...........Deployment Completed................>> %Envlog% %Date% %Time%
        ECHO ...............................................

    Here are my issues. Say I am running this for 3 servers; then for each instance:

    1. The script deletes the destination folder for all three servers, even though the destination folder is always the same; it should only delete it on the first instance (when code is deployed to the first server). Better still, I would prefer the script to check whether the code in the source and destination is already the same and use that to decide whether to delete the folder.
    2. Following on from 1: a) deleting web.config and renaming should only happen if the code in the destination is new, and b) Robocopy should not overwrite files that are the same -- I think there is an /XO option for that. Any idea how to achieve this? :)

    Read the article

  • Shared Files stuck locked even after closing all sessions

    - by Chris S
    We run a business app from a shared network drive (has to be this way). When I go to do updates it complains that files are locked. Generally there are open sessions from people who left their computer on, but with no locks on files; there aren't necessarily always sessions open when it complains about locked files. If I close these sessions they disappear. I say "disappear" because I suspect they're actually hanging open. If I try to restart the Server service, it hangs on stopping. Restarting the whole server (it's a VM) unlocks the files. The Server is a Windows 2008 R2 Ent VM running on Hyper-V; the share is accessed through DFS. Offline Files and caching are disabled (Share and GPO). All clients are Win7. Nothing has SP1 yet. Any ideas on what causes the file locks to hang? Any ideas for a solution other than rebooting the server every time?

    Read the article

  • Can a USB/IDE/SATA adapter be flaky?

    - by Ward
    I use USB/IDE/SATA converters a lot and on the two that I have now, I sometimes get errors copying files to drives. It only happens when I'm copying big files to the drive (big can mean as little as 100MB, I think it happens more often with bigger files - 300MB or more), and basically the copy will fail and I'll get one or more error messages about "Delayed write failed." But if I disconnect the drive and re-connect it, I'll usually be able to continue. (The file that was being copied will be corrupt, but otherwise the drive is fine.) I just noticed a new type of flakiness: the data transfer rate can vary widely. I copied one set of files (5x300MB files) and it took 10+minutes, then I copied another set (approx. the same sizes) and it took less than a minute. I haven't done systematic testing, the other things I'm doing on my laptop at the same time might have some impact, and I haven't cross-checked the two adapters I have and the 3 hard drives I'm working with to see if there's a pattern. I'm more wondering if anyone else has seen anything like this.

    Read the article

  • Puppet write hosts using api call

    - by Ben Smith
    I'm trying to write a Puppet function that calls my hosting environment (Rackspace Cloud at the moment) to list servers, then updates my hosts file. My get_hosts function is currently this:

        require 'rubygems'
        require 'cloudservers'

        module Puppet::Parser::Functions
          newfunction(:get_hosts, :type => :rvalue) do |args|
            unless args.length == 1
              raise Puppet::ParseError, "Must provide the datacenter"
            end
            DC       = args[0]
            USERNAME = DC == "us" ? "..." : "..."
            API_KEY  = DC == "us" ? "..." : "..."
            AUTH_URL = DC == "us" ? CloudServers::AUTH_USA : CloudServers::AUTH_UK
            DOMAIN   = "..."
            cs = CloudServers::Connection.new(:username => USERNAME,
                                              :api_key  => API_KEY,
                                              :auth_url => AUTH_URL)
            cs.list_servers_detail.map {|server|
              server.map {|s|
                { s[:name] + "." + DC + DOMAIN =>
                  { :ip => s[:addresses][:private][0], :aliases => s[:name] }}
              }
            }
          end
        end

    And I have a hosts.pp that calls this and 'should' write it to /etc/hosts:

        class hosts::us {
          $hosts = get_hosts("us")
          hostentry { $hosts: }
        }

        define hostentry() {
          host { $name:
            ip           => $name[ip],
            host_aliases => $name[aliases],
          }
        }

    As you can imagine, this isn't currently working, and I'm getting a 'Symbol as array index at /etc/puppet/manifests/hosts.pp:2' error. I imagine that once I've realised what I'm currently doing wrong there will be more errors to come. Is this a good idea? Can someone help me work out how to do this?

    Read the article

  • how do you view / access the contents of a mounted dmg drive through TERMINAL hdiutil diskmount

    - by A. O.
    My external USB drive failed, so I made a .dmg image file of the drive using Disk Utility. Later I was not able to mount the .dmg image, so in Terminal I ran hdiutil attach -noverify -nomount name.dmg, then diskutil list, then diskutil mountDisk /dev/disk4, and received the message "Volume(s) mounted successfully". However, I can't see the drive or access its contents through Finder. Disk Utility shows the drive as a greyed-out ghost, and I still can't mount it from there; Terminal, on the other hand, tells me the drive is mounted and consistently shows it in the diskutil list output as /dev/disk4. The shell's working directory (pwd) is not the mounted image, and I don't know how to change the working directory in Terminal to get inside the mounted image and see its contents -- so in case what I said sounds like I can see the files in the mounted image, no, that is not the case. I was hoping to see the mounted drive through Finder, but I don't. I need help finding a way to access the mounted image drive, if it was really mounted; Terminal says it was, and shows it under diskutil list as /dev/disk4. Can someone please help me access the files on this drive?
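
    For the navigation part, a short sketch of where OS X puts mounted volumes and how to change into one from Terminal (the volume name below is a placeholder):

        hdiutil attach name.dmg            # attach and mount; volumes land in /Volumes
        mount                              # lists each device and its mount point
        ls /Volumes                        # every mounted volume shows up here
        cd "/Volumes/My Disk" && ls        # "My Disk" stands in for your volume's name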

    Read the article

  • Determining physical location of data on a disc

    - by Synetech
    Does anybody know of a way to find out where, physically, on a CD or DVD a given piece of data is located? I am trying to watch a DVD at the moment, and am about half-way through, but it keeps dying at a specific spot in the film, presumably because of a scratch. I have a repair kit, but I don't know where to focus my repair, because there are several scuffs and scratches on the disc and I have no way of knowing which one is causing the issue. Obviously, cleaning all of them is inadvisable: not only does it waste the consumable materials in the kit, but not all of them are a problem, and by working them, some may become unreadable. Moreover, just because I am half-way through the movie does not mean the data is half-way from the hub to the edge, for several reasons: discs hold more data toward the outer edge than the inner edge (circles are more mathematically complicated than rectangles); the disc is not completely filled up (and even if it were, the movie itself would not be using all of it -- there are extras and such); and because this is a commercial DVD, it is also dual-layer, which further complicates manual determination. As such, I am trying to find a program that can let me identify a file (or part thereof), cluster, etc. and show me a picture of where on the CD/DVD it would be located. That way, I can look at the disc and fix any scratches that correspond to that distance from the hub. For example, an image might indicate where on a disc a couple of files or a range of clusters would be located, so by looking for anomalies in those areas (rotating as necessary), the correct one can be identified. I'm sure it can be done, since at least one form of copy protection (DPM) uses it and DVD-lab Pro includes a "DVD Topology" feature to do this.
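
    One rough way to estimate the radius yourself (a back-of-the-envelope model I'm assuming, not something from the disc spec): if data is laid down at roughly constant linear density between an inner radius r0 and an outer radius r1, the byte at fraction f of the layer's capacity sits at

        r(f) = sqrt(r0^2 + f * (r1^2 - r0^2))

    With a DVD data zone of roughly r0 = 24 mm and r1 = 58 mm, the halfway point f = 0.5 lands near 44 mm from the center -- noticeably farther out than the 41 mm midpoint of the band, which matches the first caveat above. The other caveats still apply: you would need the title's byte offset on the layer rather than the playback time, and on a dual-layer disc the second layer is often read from the outside in.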

    Read the article

  • How can I get write permission for the Web (Inetpub) directory on a new Win 7 machine?

    - by marcipollo
    I mirror my Web site on my laptop, and am trying to move the mirror site to a new laptop. I copied the files to the Inetpub directory, and can view them perfectly, but they are read-only (the check-mark is grey, not black), and I cannot change the permission. When I un-check the read-only attribute on the Inetpub directory and click "apply", it displays a dialog box stating that I need administrative permission to change the attributes (I am logged in as an administrator). When I click "continue", it pops up another dialog box saying access is denied to the attributes of the file c:\inetpub\custerr\en-us\500-100.asp. That dialog box has an "ignore" button, and if I click that, it appears to work through the directory tree setting the permissions. It leaves all of the files (the leaves) set to "read-write", but the directories remain "read only". I am using 64-bit Windows 7, and I stopped the IIS service while doing all of this. Might it have something to do with the fact that I copied the files from a different machine in the workgroup (my old laptop)?

    Read the article

  • One Way Sync with Dropbox?

    - by user244805
    Is there any way I can mirror a Dropbox folder to my C drive by just running a portable file? Extra background information, because I know you guys hate it when you don't get the entire situation: I go back to university in the fall and I need a new storage solution. I decided to use Dropbox to sync my tiny university files (< 5 MB). I need to access these files from 4 machines:

    1. Windows 7 home machine
    2. Windows 7 university A machine
    3. Windows 7 university B machine
    4. Android tablet

    1 and 4 are a non-issue. The problem lies with 2 and 3. I want to be able to edit my files on 2 and 3, but those machines are not mine. There is an easy fix: run a portable version of the Dropbox syncer from a USB drive. But the problem is that I don't want to carry a USB drive around with me all the time. In that case, I can just run the small portable Dropbox syncer off the internet. But where will it store the files? A temporary directory on the C drive. There is only one issue left: there are hundreds of machines that I will randomly use that fit in categories 2 and 3. My portable Dropbox syncer will notice that the temporary directory is empty on each new PC I use, and instead of downloading my Dropbox folder to the machine, it will sync the other way around, i.e. it will delete my entire Dropbox. The solution is to mirror my Dropbox onto the temporary directory before running the Dropbox syncer.

    Read the article

  • How to set an executable white list?

    - by izabera
    Under Linux, is it possible to set a whitelist of executables for a certain group of users? I need them to be unable to use, for example, make, gcc, and executables on removable disks. How can this be done? Edit: let me explain better. I'm dealing with a high-school IT system: young geeks who (during the lessons) want to play, surf the net, and damage those computers however they can. The major step toward fixing this was to remove the system they're familiar with and install Ubuntu on all the computers. This actually works quite well, but recent events proved that it is not enough. I want to allow them to execute certain safe programs, like OpenOffice, and to deny any other program, whether it is preinstalled software, something they carry on USB drives, a downloaded program, or a script they write on site. It's possible to remove the 'x' permission on any file on the PC, but of course that would be impractical, and furthermore they would still be able to run anything they download. I thought the best solution would be to make a whitelist of safe programs and deny everything else, but I don't really know how to do it. Any idea is helpful.
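
    Not a full whitelist, just a partial sketch of the mount-option piece (the device name, group name, and paths here are assumptions, not from the question). Keeping user-writable areas non-executable blocks downloaded programs and USB-stick binaries, and group permissions can fence off the build tools; a real whitelist usually means something like AppArmor profiles or a restricted shell on top of this.

        # /etc/fstab -- mount user-writable areas without execute permission
        # (assumes /home lives on /dev/sda3; adjust to your layout)
        /dev/sda3  /home  ext4   defaults,noexec,nosuid,nodev  0  2
        tmpfs      /tmp   tmpfs  defaults,noexec,nosuid,nodev  0  0

        # build tools runnable only by members of a hypothetical "devs" group
        sudo chgrp devs /usr/bin/gcc /usr/bin/make
        sudo chmod 750  /usr/bin/gcc /usr/bin/make

        # caveat: noexec stops direct execution, but "sh script.sh" still runs,
        # and removable media are mounted by udisks, which needs its own config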

    Read the article

  • How to automate downloading files?

    - by Damon
    I got a book which came with a pass to access digital versions of hi-res scans of much of the artwork in the book. Amazing! Unfortunately, the presentation of all these is 177 pages of 8 images each, with links to zip files of jpgs. It is extremely tedious to browse, and I would love to be able to get all the files at once rather than sitting and clicking through each one separately. The pages run from archive_bookname/index.1.htm through archive_bookname/index.177.htm, and each of those pages has 8 links to files such as <snip>/downloads/_Q6Q9265.jpg.zip, <snip>/downloads/_Q6Q7069.jpg.zip, and <snip>/downloads/_Q6Q5354.jpg.zip, which don't quite go in order. I cannot get a directory listing of the parent /downloads/ folder. Also, the files are behind a login wall, so using a non-browser tool might be difficult without knowing how to recreate the session info. I've looked into wget a little, but I'm pretty confused and have no idea if it will help me with this. Any advice on how to tackle this? Can wget do this for me automatically?
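
    wget can likely do it; a sketch, assuming the site's cookies are exported to cookies.txt in Netscape format (most browsers have an extension for this) and with example.com standing in for the real host:

        # bash expands index.{1..177}.htm into all 177 page URLs;
        # -r -l 1 follows each page's direct links, -A keeps only the zips
        wget --load-cookies cookies.txt \
             --recursive --level=1 \
             --accept 'jpg.zip' \
             http://example.com/archive_bookname/index.{1..177}.htm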

    Read the article

  • How do I fully share a Hard Drive on my Local Network?

    - by GingerLee
    I have 4 computers connected to a router (DD-WRT). My main PC is Windows 7 (Home Premium). This machine has 2 hard disks: HD1 is used for my OS and the other (HD2) is used to store files. My 3 other machines are (1) an Ubuntu desktop that I use to learn about Linux, (2) a Mac OS X laptop, and (3) a netbook running Windows 7. How do I easily share HD2 with my other machines? I would like all my machines to have full access and permissions to HD2, but I would like to RESTRICT access to only PCs that are connected to my router (either via LAN or WiFi). By the way, I know this is not very secure due to WiFi vulnerabilities; however, I currently restrict WiFi connections by MAC address on my router. Extra info: I have already tried to use the Windows folder-sharing feature, i.e. I right-click on the icon of HD2 and click on the Sharing tab, but in the sub-window labeled "Network File and Folder Sharing", the "Share" button is grayed out. I can click on "Advanced Sharing", but that just takes me to a screen in which I have to set certain permissions. What is not clear to me is: how do I share HD2 with all computers connected to my router?

    Read the article

  • Need software to convert RAR to ZIP ("store" mode - no compression) * Extract->re-archive changes date/time attributes

    - by Larry78
    I have tried all of the following applications (download.cnet.com -- the free ones), and none of them will convert a RAR archive to a ZIP without compressing the files ("store" mode). A 7-Zip archive would be fine, too. [The RAR is a "solid" archive with a password (I know it), files "stored" with no compression, created with WinRAR 3.5.1.] PeaZip, 7-Zip, FilZip, TugZip, SimplyZip SE, QuickZip, WinShrink: none of these apps will do the conversion at all. A couple of them let you try, but then the program gives an error (like "unknown header # #"), indicating how flaky the software is. IZArc 4.1 comes the closest: it will convert a RAR to a ZIP, but it compresses the zip. There is a general preference setting to "store", but it doesn't affect conversions. I don't want to extract the RAR files and re-archive them, because I need to preserve the modified/created file attributes. IZArc preserves them, but it compresses the files. WinRAR has the option to convert archives, but I get the error "skipping encrypted archive" when I try to convert it. It asks for the password first, and I know it's right because that password opens the archive, and I can read/view all the files in it.

    Read the article

  • My Hard Drive isn't Working

    - by MeCB
    I never use Safely Remove Hardware with Windows XP; that has worked for me for years with my SD card, mouse, hard drives, and memory stick. My hard drive has its own USB cable and power cord, so I can hook it up as an external drive to any desktop. It turns out my hard drives don't like this, and I did not know it until now. I am always careful to wait until all USB activity has finished before I unplug a drive. Now three of my hard drives can't be seen by Windows on this computer, though the others still work, and when I hook each one up to another computer it works fine. I use the same USB cable to hook up all of my hard drives, one at a time, so my USB cable is good. I think that when I unplugged the first hard drive that one time, it still had a file open, and now that drive no longer works on this computer only. The same thing then happened to my other two hard drives after I used them for a week with the same cables. How can I fix this?

    Read the article

  • psql: FATAL: could not write init file

    - by Leonardo M. Ramé
    As the title points out, I'm getting this error when trying to connect to a PostgreSQL database from the command line using psql. The client machine is Ubuntu 11.10 x86_64 with PostgreSQL 9.1 client libraries; the server is PostgreSQL 8.3. This is the command that I executed: psql -U postgres -d my_database -h 192.168.0.161 -p 5432 -c "select * from xxyy". I get the same results when I use sudo or su postgres. The sad thing is that I can connect without problems using pgAdmin. Any hint?
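
    For what it's worth, a hedged line of investigation (my own reading of the message, not a confirmed fix): the FATAL prefix means the error comes from the server, and "init file" in that message usually refers to the relation-cache file (pg_internal.init) the backend writes inside the database's data directory, so a full disk or a permissions problem on the server is worth ruling out. The paths below assume a stock 8.3 layout; adjust to yours.

        # on the server:
        df -h /var/lib/postgresql                  # is the data filesystem full?
        sudo ls -ld /var/lib/postgresql/8.3/main   # owned and writable by postgres?
        # on the client, -X skips ~/.psqlrc, ruling out the local startup file:
        psql -X -U postgres -d my_database -h 192.168.0.161 -p 5432 -c 'select 1'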

    Read the article

  • Adding a hyperlink in a client report definition file (RDLC)

    - by rajbk
    This post shows you how to add a hyperlink to your RDLC report. In a previous post, I showed you how to create an RDLC report. We have been given a new requirement for the report we created earlier, the Northwind Product report: add a column containing hyperlinks that are unique per row. The URLs will be RESTful, with the ProductID at the end, and clicking one will take the user to a URL like http://localhost/products/3, where 3 is the primary key of the product row clicked on. To start off, open the RDLC and add a new column to the product table. Add text to the header (Details) and row (Product Website). Right-click on the row (not the header), select "TextBox Properties", and choose Action - Go to URL. You could hard-code a URL here, but what we need is a URL that changes based on the ProductID, so click the expression button (fx). The Expression Builder gives you access to several functions and constants, including the fields in your dataset; see this reference for more details: Common Expressions for ReportViewer Reports. Add the following expression:

        = "http://localhost/products/" & Fields!ProductID.Value

    Click OK to exit the Expression Builder. The report will not render, because hyperlinks are disabled by default in the ReportViewer control. To enable them, add the following in your page load event (where rvProducts is the ID of your ReportViewer control):

        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                rvProducts.LocalReport.EnableHyperlinks = true;
            }
        }

    We want our links to open in a new window, so set the HyperLinkTarget property of the ReportViewer control to "_blank". We are done adding hyperlinks to our report: clicking the link for each product pops open a new window, with the ProductID added to the end of the URL. Enjoy!

    Read the article

  • Using XNA ContentPipeline to export a file in a machine without full XNA GS

    - by krolth
    My game uses the Content Pipeline to load the sprite sheet at runtime. The artist for the game sends me the modified sprite sheet, I do a build on my machine, and I send him an updated project. So I'm looking for a way to generate the .xnb files (the output of the content pipeline) on his machine without him having to install the full XNA Game Studio. (1) I don't want my artist to install VS + XNA (I know there is a free version of VS, but this won't scale once we add more people to the team). (2) I'm not interested in running this editor/tool on Xbox, so a Windows-only solution works. (3) I'm aware of the MSBuild options, but they require full XNA. I researched Shawn's blog and found the option of using the MSBuild sample, and a new option in XNA 4.0 that looked promising, but it seems to have the same restriction: you need to install full XNA GS, because the Content Pipeline is not part of the XNA redistributable. Has anyone found a workaround for this?

    Read the article

  • 30 seconds from File|New to a new CRUD Silverlight application with Telerik's new LINQ Implementation

    Last month Telerik released its new LINQ implementation, and last week we released the new Data Services Wizard for Telerik OpenAccess, which supports both traditional OpenAccess entities and the new LINQ implementation. I will walk you through the process where you connect to a database, add a new domain model, wrap it in a new WCF Data Services (Astoria) service, and add a CRUD-enabled Silverlight application. All in 30 seconds! Step 1: Build your Domain Model (20 seconds). Open Visual Studio 2010 RTM (or 2008) and add a new ASP.NET project. Right-click on the project, select Add|New Item, and choose Telerik OpenAccess Domain Model from the item template list. The Visual Entity Designer wizard comes up. Select the database server you are using in the first screen (SQL Server, Oracle, SQL Azure, MySQL, etc.) and then also build your database connection string. Next select the tables, views, and stored procedures you want ...

    Read the article

  • Large File Upload in SharePoint 2010


    Read the article

  • NTFS partitions hidden under EXT4 file system / partition...want to recover files from NTFS

    - by user7534
    I am new to Ubuntu, but very impressed with the system. One day I tried installing Ubuntu 10.10 alongside Windows in dual boot. The first time it didn't install properly; the second attempt went right, but oh... I lost my Windows 7. Here is my problem and what I have done so far: (1) the HDD has Ubuntu installed, and the same disk holds the Windows partitions I need to extract data from -- very, very important data; (2) I tried to access them from Ubuntu and cannot; (3) I reinstalled Windows 7, but the HDD is not detected; (4) during that installation Ubuntu was wiped, so I reinstalled it. A scan in Ubuntu says the HDD is fine, and DiskInternals Linux Reader actually shows the NTFS partitions, but the recovery tool is not able to get any data out. Please help -- I need the data from these partitions. I suspect that I have put an EXT4 partition on top of the NTFS file system and am now not able to access the old data.
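
    A hedged starting point, assuming the disk shows up as /dev/sda (check first) and that you stop writing to it until the data is out: TestDisk can search for the old NTFS partition boundaries even after a new partition table has been written over them.

        sudo apt-get install testdisk   # partition-recovery tool, in the Ubuntu repos
        sudo fdisk -l                   # confirm which device is the affected disk
        sudo testdisk /dev/sda          # then: Analyse -> Quick Search / Deeper Search
        # found NTFS partitions can be browsed (press 'P' to list files) and the
        # partition table rewritten -- safest after imaging the disk elsewhere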

    Read the article

  • Unix list absolute file name

    - by Matthew Adams
    Given an arbitrary single argument representing a file (or directory, device, etc.), how do I get the absolute path of the argument? I've seen many answers to this question involving find/ls/stat/readlink and $PWD, but none that suits my need. It looks like the closest answer is ksh's "whence" command, but I need it to work in sh/bash. Assume a file, foo.txt, is located in my home directory, /Users/matthew/foo.txt. I need the following behavior, no matter what my current working directory is (I'm calling the command "abs"):

        (PWD is ~)
        $ abs foo.txt
        /Users/matthew/foo.txt
        $ abs ~/foo.txt
        /Users/matthew/foo.txt
        $ abs ./foo.txt
        /Users/matthew/foo.txt
        $ abs /Users/matthew/foo.txt
        /Users/matthew/foo.txt

    What would "abs" really be? TIA, Matthew
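
    Not from the thread, just a minimal sketch that produces the behavior shown above for paths whose parent directory exists (the name abs matches the question's usage; it does not resolve symlinks):

        # POSIX sh/bash: print the absolute path of the single argument
        abs() {
          case "$1" in
            /*) printf '%s\n' "$1" ;;   # already absolute (covers ~/ after expansion)
            *)  printf '%s/%s\n' "$(cd "$(dirname -- "$1")" && pwd)" \
                                 "$(basename -- "$1")" ;;
          esac
        }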

    Read the article
