Search Results

Search found 102772 results on 4111 pages for 'sql server 2008'.

  • Allow an application to access a drive while blocking direct access from users in Server 2008 R2?

    - by Justin Hawk
    In Windows Server 2008 R2, can I grant permission for a specific application to access a drive, while at the same time blocking users from viewing/browsing/reading that drive? Edit: Additional Info: Users are logged in to a terminal server. The application is a 3rd party rich GUI .exe, launched by the user. It stores large image files on the hard disk. I would like the user to only be able to access these files through the application, not by browsing the disk. The application does not have a service component.
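
    One common workaround, sketched below, is to create a dedicated account that owns the image store and launch the application under that account instead of the logged-on user. The account name, folder and executable path here are placeholders rather than details from the question, and runas /savecred may be disabled by policy on a terminal server.

        # Strip inherited permissions and grant access only to admins and the dedicated account
        icacls "D:\ImageStore" /inheritance:r
        icacls "D:\ImageStore" /grant "BUILTIN\Administrators:(OI)(CI)F" "CONTOSO\ImgAppSvc:(OI)(CI)M"

        # Launch the third-party GUI under the dedicated account; ordinary users can no longer browse the folder
        runas /savecred /user:CONTOSO\ImgAppSvc "C:\Program Files\ImageApp\ImageApp.exe"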

    Read the article

  • Does UNC work between two Intranet domains?

    - by mjustin
    We have some servers which will move to a new Windows 2008 domain, and some Windows 2000 servers which I would prefer to keep in their current domain for a while (until we have the resources to test and reinstall them as Windows 2008 systems in the new domain too). Can UNC still be used to connect to file server resources on the new system from the old servers? Or is UNC limited to work only within one domain? I'll do tests on Monday, but any feedback would be very welcome.
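
    For what it's worth, UNC paths are not tied to a single domain; what matters is name resolution and whether the target server accepts the account (via a trust or via explicit credentials). A quick hedged test from one of the old-domain machines, with the server, share and account names below being placeholders:

        # Connect to a share on the new-domain file server with explicit credentials (prompts for the password)
        net use \\newdom-fs01\data /user:NEWDOMAIN\testuser *

        # Or, from PowerShell, check reachability under the current credentials
        Test-Path \\newdom-fs01\data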

    Read the article

  • How do I create a free server-side database to be accessed via Windows forms and/or browser?

    - by NoCatharsis
    I have no formal education in databasing or programming, but I've learned enough SQL, C++, and C# to at least get started setting up a small database on my company's server. Using MS SQL Server 2008 R2, I have created the database and set up columns with proper data types. However, there seem to be a lot of tweaks and details that are way over my head. Since I would like these data to be accessible to the other 7 or 8 people in my office (preferably via web browser), I'm wondering whether this is the best setup for my situation. The other option I've read about is a LAMP server, which I assume is the competing free option to Microsoft's Express packages. I know nothing of LAMP servers except from the articles I've read on how to set them up (and I believe I even saw a detailed tutorial somewhere). To summarize, my question is this: which of these (or any other) server setups would best suit my purposes, keeping in mind that I'm a true novice (but willing to learn), and would like to keep it free until I get more experience?

    Read the article

  • Authentication between web server and sql server

    - by thirster42
    So, we're restructuring our development environment. We've installed SQL Server onto its own server, and we are publishing our internal web applications onto a separate web server. However, the problem is that for some reason the authentication of the user between the web server and the SQL server gets lost, resulting in the Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON' error. Where do I start troubleshooting? SQL Server is set up for TCP and remote connections.
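
    The ANONYMOUS LOGON failure usually points at the Kerberos double-hop problem or a missing Service Principal Name, so one reasonable first check is whether SQL Server's SPNs exist for the account it runs under; a sketch, with the host and account names below being placeholders:

        # List the SPNs registered for the SQL Server service account
        setspn -L CONTOSO\sqlservice

        # Register the MSSQLSvc SPNs if they are missing (needs rights to write SPNs)
        setspn -S MSSQLSvc/sqlbox01.contoso.local:1433 CONTOSO\sqlservice
        setspn -S MSSQLSvc/sqlbox01:1433 CONTOSO\sqlservice

        # On the web server, after hitting the site once, see whether Kerberos tickets were actually issued
        klist

    If the web application needs to pass the user's own identity through to SQL Server, Kerberos delegation also has to be enabled for the web server or application pool account.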

    Read the article

  • Why has ESXi 5.0 not used the software RAID configuration on my test server?

    - by kafka
    I've got a test server which was running WS 2008 Enterprise on the bare metal. It was correctly using the software RAID 1 configuration (2x250 GB disks which appeared as one disk), set up on the Dell PowerEdge T110 (which meets compatibility requirements), without requiring any extra setup from me. (As an aside, I'm fairly sure it's software RAID, as we didn't spec a hardware RAID controller, if that's of any importance in this situation.) I am now testing installing ESXi 5.0 on this server to run some VMs. I've successfully installed ESXi and imported a VM fine, but it's showing 2 x 250 GB disks available as datastores. However, they should be appearing as one volume. When I boot the server, there is a RAID configuration screen you can enter, and I'm guessing this is what I'll have to do at some stage, but I now need to be very careful because there is one disk which contains data that I want to be mirrored on the other disk. What is the best thing to do in this situation?

    Read the article

  • Setting up an externally facing server on Windows. How do I set up DNS/nameservers?

    - by Jason Miesionczek
    So I have a domain name that I would like to host from my static-IP internet connection. I have Windows Server 2008 R2 installed, and DNS set up. The DNS server is currently behind a firewall, and I have the appropriate rules to allow traffic to reach it. My question is, what entries do I need to create in the DNS so that I can have some nameservers to use at my domain registrar, so that the domain correctly points to the server? I know that most domains have nameservers like ns1.domain.com, ns2.domain.com, etc. What would I point those to in my DNS?
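
    As a rough sketch of the records involved (assuming the zone is example.com and the public static IP is 203.0.113.10; both are placeholders): create A records for ns1 and ns2 pointing at the public IP, make them the NS records for the zone, and register the same names and IP as glue records at the registrar. With the built-in dnscmd tool that might look like this:

        # Host records that the registrar's nameserver entries will point at
        dnscmd /RecordAdd example.com ns1 A 203.0.113.10
        dnscmd /RecordAdd example.com ns2 A 203.0.113.10

        # Make them the authoritative nameservers for the zone
        dnscmd /RecordAdd example.com @ NS ns1.example.com.
        dnscmd /RecordAdd example.com @ NS ns2.example.com.

        # Records for the site itself
        dnscmd /RecordAdd example.com @ A 203.0.113.10
        dnscmd /RecordAdd example.com www A 203.0.113.10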

    Read the article

  • If my Remote Desktop Connection Broker server goes down, can users still access my two Terminal Servers?

    - by Frank Owen
    I would like to set up the Remote Desktop Connection Broker to allow better load balancing of the two terminal servers we have, as well as allowing users to re-establish a connection to the correct server if they get disconnected. My worry is, if I set this up and the server this service is running on goes down, do the terminal servers stop accepting connections, or do users just lose the benefit of having RDCB turned on? I don't want to add another point of failure in this equation unless I have to.

    Read the article

  • SQL Server Express with Advanced Services (with Reporting Services)???

    - by Fretwizard
    I have tried to download SQL Server 2005 Express Edition about 4 times, trying to find the correct version that has Business Intelligence Development Studio and Reporting Services in it. Every time I try to unhide the advanced configuration during install, it's never there... Can anyone point me to the correct download? I'm looking for 2005 (not 2008) because my work SQL Server that I am trying to learn this for is 2005, the training material I have is for 2005, and VS 2008 does not want to integrate with SQL 2008 Express.

    Read the article

  • Why are domain users regularly asked for credentials when requesting shared folders on the server?

    - by MFH
    In our network (a single domain controller), some users who are members of the domain are required to enter their credentials when they request shared folders on the server, which is Windows Server 2008 R2. Even checking the option to remember credentials doesn't work. Sometimes it shows this message: "The system has detected a possible attempt to compromise security. Please ensure that you can contact the server that authenticated you", and sometimes it shows different messages. When I try to recreate the case I sometimes fail. I searched Google for it and didn't find useful results; some posts talk about Kerberos, but we don't use Kerberos. This keeps happening every day or two. How do I overcome this? I don't want these messages to appear to users.
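
    In a default Active Directory setup Kerberos is in play even when nothing references it explicitly, and the "possible attempt to compromise security" message is often down to ticket or clock-skew problems, so a few hedged checks from an affected client (all built-in tools; the DC and domain names below are placeholders):

        # Check clock skew against a domain controller (Kerberos tolerates about 5 minutes by default)
        w32tm /stripchart /computer:dc01.contoso.local /samples:3 /dataonly

        # Verify the workstation's secure channel to the domain
        nltest /sc_verify:CONTOSO

        # Inspect, and while testing purge, the cached Kerberos tickets
        klist
        klist purge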

    Read the article

  • Every time I ping my server, it pings localhost instead?

    - by esac
    I recently set up a new server for use with SQL. When I tried to connect via SSMS remotely, it failed. When I pinged it, it pinged localhost; what is going on here? Please let me know if more details will help. It is Windows Server 2008.

        C:\>ping 0x7F000001

        Pinging 127.0.0.1 with 32 bytes of data:
        Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
        Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
        Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
        Reply from 127.0.0.1: bytes=32 time<1ms TTL=128

        Ping statistics for 127.0.0.1:
            Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
        Approximate round trip times in milli-seconds:
            Minimum = 0ms, Maximum = 0ms, Average = 0ms
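
    For reference, 0x7F000001 is simply the 32-bit hexadecimal form of 127.0.0.1 (0x7F = 127, 0x00 = 0, 0x00 = 0, 0x01 = 1), so ping is resolving the literal exactly as asked rather than reaching the remote box. A quick sketch to confirm the conversion, then pinging the server by name (the hostname is a placeholder):

        # Convert the hex literal back to dotted-quad form
        $bytes = [BitConverter]::GetBytes([uint32]0x7F000001)
        [Array]::Reverse($bytes)   # GetBytes returns little-endian byte order on Windows
        $bytes -join '.'           # -> 127.0.0.1

        # Test the actual server by name or by its real IP address
        ping sqlbox01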

    Read the article

  • Shares not working on boot; need to reinstall "File and Printer Sharing for Microsoft Networks" on the server every morning to fix

    - by Neaox
    I had a problem a few days ago, see question here: Can no longer access computer or network shares to my server from any other computers on the network. The fix that I found does in fact work, however when I boot my PC in the morning the shares are no longer working; to fix this I need to remote desktop into the server and re-install "File and Printer Sharing for Microsoft Networks" on the main adaptor. Doing this makes the shares work again, however it is annoying to have to do this each and every morning. On top of this, my offline files are no longer available offline: I store my user profile on the server and had the files selected to be "Always Available", however since this happened they are no longer available offline, and the option to make them available offline is missing from the context menu. Another problem, and I don't know if this is the cause or just a symptom of a deeper issue, is that this server runs Hyper-V, and since these problems started I can no longer remote desktop into the Hyper-V client. Thanks for any help anyone can give.
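
    Purely as a stop-gap while the root cause is tracked down, and assuming it is really the Server service that needs a kick after boot (which may not match the binding problem described here), a startup task that restarts file sharing can at least remove the manual morning step; a sketch:

        # One-off: restart the Server (file sharing) service and anything that depends on it
        Restart-Service LanmanServer -Force

        # Run the same command a couple of minutes after every boot
        schtasks /create /tn "RestartFileSharing" /ru SYSTEM /sc onstart /delay 0002:00 /tr "powershell.exe -Command Restart-Service LanmanServer -Force"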

    Read the article

  • OpenVAS - Microsoft RDP Server Private Key Information Disclosure Vulnerability - false alarm?

    - by huebkov
    I performed an OpenVAS scan on a Windows Server 2008 R2 machine and got a report for a high-threat-level vulnerability called Microsoft RDP Server Private Key Information Disclosure Vulnerability. A remote attacker could perform a man-in-the-middle attack to gain access to an RDP session. The affected software is Microsoft RDP 5.2 and below. My server uses RDP 7.1; is this alarm a false alarm? The security advisory pages say: Solution status: unpatched, no remedy... References http://secunia.com/advisories/15605/ http://xforce.iss.net/xforce/xfdb/21954/ http://www.oxid.it/downloads/rdp-gbu.pdf CVE: CVE-2005-1794 BID:13818
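
    The underlying advisory concerns the hard-coded private key used by the legacy RDP security layer, so one way to satisfy yourself (and the scanner) is to confirm the listener is using SSL/TLS with Network Level Authentication rather than native RDP security; a hedged check via WMI on Server 2008 R2:

        # SecurityLayer: 0 = RDP security layer, 1 = Negotiate, 2 = SSL/TLS
        $rdp = Get-WmiObject -Namespace root\cimv2\TerminalServices -Class Win32_TSGeneralSetting -Filter "TerminalName='RDP-Tcp'"
        $rdp | Select-Object TerminalName, SecurityLayer, UserAuthenticationRequired

        # Enforce TLS and NLA if they are not already required
        # $rdp.SetSecurityLayer(2)
        # $rdp.SetUserAuthenticationRequired(1)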

    Read the article

  • I require a server hosting package suitable for several .NET managed applications, accessed only by me

    - by user67166
    Hello, I currently run a server at home consisting of SQL Server 2008, .NET Framework 2010, a VPN connection, and ASP.NET web services, running around 5-6 applications supporting a financial trading system that I regularly use. The only user is me. Recently the requirement to have these applications running in a 24/7, 100% uptime (or 99%) environment has become important. I can no longer both meet this requirement and host my server at home on my network, so I am looking to move to a dedicated hosting company. After some research, the only real companies I can find offering such services are geared towards company web-space hosting. I don't need 1TB+ bandwidth; what I need is CPU, memory and as much control over the environment as possible. Does anyone have any examples of such a service? Thanks in advance.

    Read the article

  • Windows Server: Do I really need servers in remote locations?

    - by IMAbev
    I have one main site with several servers in a 2008/2012 environment. I have 4 remote sites that are physically close (a few miles apart) and are all connected to the main site by 20 meg fiber on a private network. At each of the remote locations I have a Windows server that users log in to and where their files and apps are located. There are many considerations to answering this question, but the first thing I am wondering is: do I really need a server at each location? Users are just logging in to this server for permissions, and a vast majority of my users are only using Word, Excel and email. I am really interested in figuring out if I need servers at these locations. $3,000 to $4,000 per server every 3-5 years, licensing, administration... I know there are other considerations - speed, redundancy, and if my link to the main site goes down the users have nothing. But I am just not convinced I need servers at these locations.

    Read the article

  • How do you host images using Windows Server so that they are accessible over the internet? [closed]

    - by nairware
    I was trying to figure out a way to host images (picture images, not disk images) such that they are accessible over the internet via URLs, in a way similar to a web service like Photobucket or ImageShack. I have a whole bunch of Windows Servers (Windows Server 2008 R2) available in the cloud. Instead of hosting images using Photobucket or ImageShack, I wanted to host these images directly on my own Windows cloud. This could be really complicated or really simple. I have no idea, as I know very little about IIS 6 (which is what I am using) or web servers. If this is too broad of a question (as there are probably multiple ways of implementing this), is there at least some guide or documentation of how someone else has set up image hosting? Perhaps a step-by-step guide of at least one way to do it?
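
    At its simplest this is just an IIS site or virtual directory whose physical path is the folder holding the images, so a file becomes reachable as http://yourserver/images/whatever.jpg. A sketch using the WebAdministration module that ships with IIS 7.5 on Server 2008 R2 (site name, port and folder are placeholders):

        Import-Module WebAdministration

        # Serve D:\Images as its own site on port 8080
        New-Item -ItemType Directory -Path D:\Images -Force | Out-Null
        New-Website -Name "ImageHost" -Port 8080 -PhysicalPath "D:\Images"

        # Or expose it under the existing Default Web Site as /images
        New-WebVirtualDirectory -Site "Default Web Site" -Name "images" -PhysicalPath "D:\Images"

        # Directory browsing is off by default; enable it if a plain file listing is wanted
        Set-WebConfigurationProperty -Filter /system.webServer/directoryBrowse -PSPath "IIS:\Sites\ImageHost" -Name enabled -Value $true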

    Read the article

  • SQL Server 2008, Kerberos and SPN

    - by andrew007
    Hi, I installed SQL Server 2008 on a Win XP SP2 workstation in a AD domain and configured to run with the "Network Service" account. In my error log I have the following message (Event ID:26037): The SQL Server Network Interface library could not register the Service Principal Name (SPN) for the SQL Server service. Error: 0xd, state: 13. Failure to register an SPN may cause integrated authentication to fall back to NTLM instead of Kerberos. This is an informational message. Further action is only required if Kerberos authentication is required by authentication policies. The strange thing is that I have another SQL Server 2008 installation in a Win 2003 server configured in the same way and there I do not have this message. My questions are: Does anybody know if there are limitations with Kerberos on Windows XP and SQL Server? Why the SPN is not automatically registered on Win XP when I use the "Network Service" but it works on Windows 2003 server? THANKS!
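
    When SQL Server runs as Network Service, the SPN is registered (or not) under the machine account rather than a user account, so a quick way to compare the two installations is to list what each computer account actually has; a sketch with placeholder machine and domain names:

        # SPNs currently registered on the XP workstation's computer account
        setspn -L CONTOSO\XPBOX01$

        # Register the MSSQLSvc SPNs by hand if they are missing
        # (run as a domain admin, or give the machine account "Validated write to service principal name")
        setspn -A MSSQLSvc/xpbox01.contoso.local:1433 CONTOSO\XPBOX01$
        setspn -A MSSQLSvc/xpbox01.contoso.local CONTOSO\XPBOX01$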

    Read the article

  • Automating deployments with the SQL Compare command line

    - by Jonathan Hickford
    In my previous article, “Five Tips to Get Your Organisation Releasing Software Frequently” I looked at how teams can automate processes to speed up release frequency. In this post, I’m looking specifically at automating deployments using the SQL Compare command line. SQL Compare compares SQL Server schemas and deploys the differences. It works very effectively in scenarios where only one deployment target is required – source and target databases are specified, compared, and a change script is automatically generated and applied. But if multiple targets exist, and pressure to increase the frequency of releases builds, this solution quickly becomes unwieldy.

    This is where SQL Compare’s command line comes into its own. I’ve put together a PowerShell script that loops through the Servers table and pulls out the server and database; these are then passed to sqlcompare.exe to be used as target parameters. In the example the source database is a scripts folder, a folder structure of scripted-out database objects used by both SQL Source Control and SQL Compare. The script can easily be adapted to use schema snapshots.

        -- Create a DeploymentTargets database and a Servers table
        CREATE DATABASE DeploymentTargets
        GO
        USE DeploymentTargets
        GO
        CREATE TABLE [dbo].[Servers](
            [id] [int] IDENTITY(1,1) NOT NULL,
            [serverName] [nvarchar](50) NULL,
            [environment] [nvarchar](50) NULL,
            [databaseName] [nvarchar](50) NULL,
            CONSTRAINT [PK_Servers] PRIMARY KEY CLUSTERED ([id] ASC)
        )
        GO

        -- Now insert your target server and database details
        INSERT INTO dbo.Servers ( serverName , environment , databaseName)
        VALUES ( N'myserverinstance' , N'myenvironment1' , N'mydb1')

        INSERT INTO dbo.Servers ( serverName , environment , databaseName)
        VALUES ( N'myserverinstance' , N'myenvironment2' , N'mydb2')

    Here’s the PowerShell script you can adapt for yourself as well.

        # We're holding the server names and database names that we want to deploy to in a database table.
        # We need to connect to that server to read these details
        $serverName = ""
        $databaseName = "DeploymentTargets"
        $authentication = "Integrated Security=SSPI"
        #$authentication = "User Id=xxx;PWD=xxx" # If you are using database authentication instead of Windows authentication.

        # Path to the scripts folder we want to deploy to the databases
        $scriptsPath = "SimpleTalk"

        # Path to SQLCompare.exe
        $SQLComparePath = "C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe"

        # Create SQL connection string, and connection
        $ServerConnectionString = "Data Source=$serverName;Initial Catalog=$databaseName;$authentication"
        $ServerConnection = new-object system.data.SqlClient.SqlConnection($ServerConnectionString);

        # Create a Dataset to hold the DataTable
        $dataSet = new-object "System.Data.DataSet" "ServerList"

        # Create a query
        $query = "SET NOCOUNT ON;"
        $query += "SELECT serverName, environment, databaseName "
        $query += "FROM dbo.Servers; "

        # Create a DataAdapter to populate the DataSet with the results
        $dataAdapter = new-object "System.Data.SqlClient.SqlDataAdapter" ($query, $ServerConnection)
        $dataAdapter.Fill($dataSet) | Out-Null

        # Close the connection
        $ServerConnection.Close()

        # Populate the DataTable
        $dataTable = new-object "System.Data.DataTable" "Servers"
        $dataTable = $dataSet.Tables[0]

        # For every row in the DataTable
        $dataTable | FOREACH-OBJECT {

            "Server Name: $($_.serverName)"
            "Database Name: $($_.databaseName)"
            "Environment: $($_.environment)"

            # Compare the scripts folder to the database and synchronize the database to match
            # NB. Have set SQL Compare to abort on medium level warnings.
            $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/AbortOnWarnings:Medium") # + @("/sync" ) # Commented out the 'sync' parameter for safety
            write-host $arguments
            & $SQLComparePath $arguments
            "Exit Code: $LASTEXITCODE"

            # Some interesting variations

            # Check that every database matches a folder.
            # For example this might be a pre-deployment step to validate everything is at the same baseline state.
            # Or a post deployment script to validate the deployment worked.
            # An exit code of 0 means the databases are identical.
            #
            # $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/Assertidentical")

            # Generate a report of the difference between the folder and each database. Generate a SQL update script for each database.
            # For example use this after the above to generate upgrade scripts for each database
            # Examine the warnings and the HTML diff report to understand how the script will change objects
            #
            # $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/ScriptFile:update_$($_.environment+"_"+$_.databaseName).sql", "/report:update_$($_.environment+"_"+$_.databaseName).html", "/reportType:Interactive", "/showWarnings", "/include:Identical")
        }

    It’s worth noting that the above example generates the deployment scripts dynamically. This approach should be problem-free for the vast majority of changes, but it is still good practice to review and test a pre-generated deployment script prior to deployment. An alternative approach would be to pre-generate a single deployment script using SQL Compare, and run this en masse against multiple targets programmatically using sqlcmd, or using a tool like SQL Multi Script. You can use the /ScriptFile, /report, and /showWarnings flags to generate change scripts, difference reports and any warnings. See the commented-out example in the PowerShell:

        #$arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/ScriptFile:update_$($_.environment+"_"+$_.databaseName).sql", "/report:update_$($_.environment+"_"+$_.databaseName).html", "/reportType:Interactive", "/showWarnings", "/include:Identical")

    There is a drawback to running a pre-generated deployment script: it assumes that a given database target hasn’t drifted from its expected state. Often there are (rightly or wrongly) many individuals within an organization who have permissions to alter the production database, and changes can therefore be made outside of the prescribed development processes. The consequence is that at deployment time, the applied script has been validated against a target that no longer represents reality. The solution here would be to add a check for drift prior to running the deployment script. This is achieved by using sqlcompare.exe to compare the target against the expected schema snapshot using the /Assertidentical flag. Should this return any differences (sqlcompare.exe exit code 79), a drift report is output instead of executing the deployment script. See the commented-out example:

        # $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/Assertidentical")

    Any checks and processes that should be undertaken prior to a manual deployment should also happen during an automated deployment.
You might think about triggering backups prior to deployment – even better, automate the verification of the backup too.   You can use SQL Compare’s command line interface along with PowerShell to automate multiple actions and checks that you need in your deployment process. Automation is a practical solution where multiple targets and a higher release cadence come into play. As we know, with great power comes great responsibility – responsibility to ensure that the necessary checks are made so deployments remain trouble-free.  (The code sample supplied in this post automates the simple dynamic deployment case – if you are considering more advanced automation, e.g. the drift checks, script generation, deploying to large numbers of targets and backup/verification, please email me at [email protected] for further script samples or if you have further questions)

    Read the article

  • Heroku Postgres: A New SQL Database-as-a-Service

    Idera, a Houston-based company known worldwide for its SQL Server solutions in the realms of backup and recovery, performance monitoring, auditing, security, and more, recently announced that it had won five of SQL Server Magazine's 2011 Community Choice Awards. SQL Server Magazine, a publication produced by Penton Media, offers SQL Server users, both beginning and advanced, a host of hands-on information delivered by SQL Server experts. The magazine presented Idera with 2011 Community Choice Awards for five separate products which will only serve to boost the already strong reputation of it...

    Read the article

  • Idera Releases SQL Diagnostic Manager v7.1

    Idera recently beefed up its portfolio of SQL Server management and administration tools with the release of SQL diagnostic manager 7.1. Idera has enhanced SQL diagnostic manager's already impressive set of features in version 7.1 with new additions that should appeal to database administrators. The release is another example of Idera's dedication to growing its portfolio of SQL Server solutions that has allowed the Microsoft Managed Partner to expand its client base to over 10,000 customers worldwide. The highlights of SQL diagnostic manager 7.1's new features begin with an impressive Serve...

    Read the article

  • How to Schedule Backups with SQL Server Express

    - by The Official Microsoft IIS Site
    Microsoft’s SQL Server Express is a fantastic product for anyone needing a relational database on a limited budget. By limited budget I’m talking free. Yes, SQL Server Express is free, but it comes with a few limitations, such as only utilizing 1 GB of RAM, a 10 GB cap on database size, and no SQL Profiler. For low-volume sites that do not need enterprise-level capabilities, this is a compelling solution. Here is a complete SQL Server feature comparison of all the SQL Server...(read more)
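
    Because Express has no SQL Server Agent, the usual pattern (presumably what the linked post walks through) is a plain BACKUP DATABASE command run by Windows Task Scheduler; a minimal sketch, with the instance, database and paths below being placeholders:

        # backup_mydb.sql contains:  BACKUP DATABASE [MyDb] TO DISK = N'C:\Backups\MyDb.bak' WITH INIT, CHECKSUM;

        # Run it once by hand to test
        sqlcmd -S .\SQLEXPRESS -E -i C:\Backups\backup_mydb.sql

        # Then schedule it nightly at 02:00 under the SYSTEM account
        schtasks /create /tn "Nightly MyDb backup" /ru SYSTEM /sc daily /st 02:00 /tr "sqlcmd -S .\SQLEXPRESS -E -i C:\Backups\backup_mydb.sql"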

    Read the article

  • Querying Visual Studio project files using T-SQL and Powershell

    - by jamiet
    Earlier today I had a need to get some information out of a Visual Studio project file, and in this blog post I’m going to share a couple of ways of going about that because I’m pretty sure I won’t be the only person that ever wants to do this. The specific problem I was trying to solve was finding out how many objects in my database project (i.e. in my .dbproj file) had any warnings suppressed, but the techniques discussed below will work pretty well for any Visual Studio project file because every such file is simply an XML document, hence it can be queried by anything that can query XML documents. Ever heard the phrase “when all you’ve got is a hammer everything looks like a nail”? Well that’s me with querying stuff – if I can write SQL then I’m writing SQL. Here’s a little noddy database project I put together for demo purposes: two views and a stored procedure, nothing fancy. I suppressed warnings for [View1] & [Procedure1] and hence the pertinent part of my project file looks like this:

        <ItemGroup>
          <Build Include="Schema Objects\Schemas\dbo\Views\View1.view.sql">
            <SubType>Code</SubType>
            <SuppressWarnings>4151,3276</SuppressWarnings>
          </Build>
          <Build Include="Schema Objects\Schemas\dbo\Views\View2.view.sql">
            <SubType>Code</SubType>
          </Build>
          <Build Include="Schema Objects\Schemas\dbo\Programmability\Stored Procedures\Procedure1.proc.sql">
            <SubType>Code</SubType>
            <SuppressWarnings>4151</SuppressWarnings>
          </Build>
        </ItemGroup>
        <ItemGroup>

    Note the <SuppressWarnings> elements – those are the bits of information that I am after. With a lot of help from folks on the SQL Server XML forum I came up with the following query that nailed what I was after. It reads the contents of the .dbproj file into a variable of type XML and then shreds it using T-SQL’s XML data type methods:

        DECLARE @xml XML;

        SELECT @xml = CAST(pkgblob.BulkColumn AS XML)
        FROM   OPENROWSET(BULK 'C:\temp\QueryingProjectFileDemo\QueryingProjectFileDemo.dbproj' -- <-Change this path!
                         ,single_blob) AS pkgblob
        ;
        WITH XMLNAMESPACES( 'http://schemas.microsoft.com/developer/msbuild/2003' AS ns)
        SELECT  REVERSE(SUBSTRING(REVERSE(ObjectPath),0,CHARINDEX('\',REVERSE(ObjectPath)))) AS [ObjectName]
               ,[SuppressedWarnings]
        FROM   (
               SELECT  build.query('.') AS [_node]
               ,       build.value('ns:SuppressWarnings[1]','nvarchar(100)') AS [SuppressedWarnings]
               ,       build.value('@Include','nvarchar(1000)') AS [ObjectPath]
               FROM    @xml.nodes('//ns:Build[ns:SuppressWarnings]') AS R(build)
               )q

    And here’s the output: And that’s it – an easy way of discovering which warnings have been suppressed, and for which objects, in your database projects. I won’t bother going over the code as it is fairly self-explanatory – peruse it at your leisure. Once I had the SQL above I figured I’d share it around a little in case it was ever useful to anyone else; hence I’m writing this blog post and I also posted it on the Visual Studio Database Development Tools forum at FYI: Discover which objects have had warnings suppressed. Luckily Kevin Goode saw the thread and he posted a different solution to the same problem, one that uses Powershell. The advantage of Kevin’s Powershell approach is that it is easy to analyse many .dbproj files at the same time.
    Below is Kevin’s code which I have tweaked ever so slightly so that it produces the same results as my SQL script (I just want any object that had had a warning suppressed whereas Kevin was querying specifically for warning 4151):

        cd 'C:\Temp\QueryingProjectFileDemo\'
        cls
        $projects = ls -r -i *.dbproj
        Foreach($project in $projects)
        {
            $xml = new-object System.Xml.XmlDocument
            $xml.set_PreserveWhiteSpace( $true )
            $xml.Load($project)
            #$xpath = @{Start="/e:Project/e:ItemGroup/e:Build[e:SuppressWarnings=4151]/@Include"}
            #$xpath = @{Start="/e:Project/e:ItemGroup/e:Build[contains(e:SuppressWarnings,'4151')]/@Include"}
            $xpath = @{Start="/e:Project/e:ItemGroup/e:Build[e:SuppressWarnings]/@Include"}
            $ns = @{ e = "http://schemas.microsoft.com/developer/msbuild/2003" }
            $xml | Select-Xml -XPath $xpath.Start -Namespace $ns | Select -Expand Node | Select -expand Value
        }

    and here’s the output: Nice reusable Powershell and SQL scripts – not bad for an evening’s work. Thank you to Kevin for allowing me to share his code. Don’t forget that these techniques can easily be adapted to query any Visual Studio project file, they’re only XML documents after all! Doubtless many people out there already have code for doing this but nonetheless here is another offering to the great script library in the sky. Have fun! @Jamiet

    Read the article

  • How to get permission to create full-text index?

    - by Bill Paetzke
    I tried to create a full-text index and got this error:

        Msg 9967, Level 16, State 1, Line 1
        A default full-text catalog does not exist in database 'foo' or user does not have permission to perform this action.

    FYI: I connected to the target SQL Server with Windows Authentication. What do I need to do in SQL Server 2005 and/or in Windows Server 2003 to get permissions? Please be thorough (assume I am a n00b). Thank you.
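
    The error covers two separate possibilities, so it can help to first try creating a default full-text catalog and, if that fails with a permissions error, have someone with db_owner/sysadmin rights grant what is needed; a sketch run through sqlcmd (server, database and user names are placeholders):

        # Create a default full-text catalog in the database (needs the CREATE FULLTEXT CATALOG permission)
        sqlcmd -S myserver -d foo -E -Q "CREATE FULLTEXT CATALOG ftDefaultCatalog AS DEFAULT;"

        # If that fails with a permissions error, a db_owner/sysadmin can grant the rights to your database user
        sqlcmd -S myserver -d foo -E -Q "GRANT CREATE FULLTEXT CATALOG TO [CONTOSO\bill]; GRANT ALTER ANY FULLTEXT CATALOG TO [CONTOSO\bill];"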

    Read the article

  • Dreamweaver CS5 Test server works but cannot connect to host server through files window

    - by Toni
    I've been managing this site for a long time and update coupons on it approximately every 60 days. For some reason, I'm now having problems: I opened DW CS5 today and made the changes necessary to update coupons. I was able to connect to the host server with no problem, but most of my coupon images were not showing up. DW tells me I have 70 broken links, which can't be the case because I've reviewed them. Some links work and are the same as the broken links other than the file name. Unable to figure it out, I thought maybe restarting my Mac would help. However, upon logging back into DW, I am now unable to connect to the host server. I get an FTP error notice that the file doesn't exist or there is a permissions problem. Funny thing is, I can connect successfully if I test the connection through the Site Management window. I have connected to my host server through FileZilla and I see all the files there; unfortunately, I still can't get the web pages to display the coupons. Has anyone else had this issue, and if so, what is the solution? I feel like this is probably a simple fix, but I cannot for the life of me determine what it is! If anyone knows a solution, I'd really appreciate the help! -Toni

    Read the article
