Search Results

Search found 5623 results on 225 pages for 'prevent deletion'.

Page 59/225

  • How to prepare WiFi for an on-stage demo?

    - by Jeremy White
    Today at WWDC, Steve Jobs gave his keynote and ended up having a failure on-stage when connecting to WiFi. Google had a similar issue a few weeks ago in the same conference center. Please reference the following article for more information. http://news.cnet.com/8301-31021_3-20007009-260.html I am looking for information on how to best prepare a demo which uses a closed wireless network in front of a large audience. Note that the network will be closed, and will not require internet access. What steps can I take to prevent interference from existing WiFi, Bluetooth, etc? How can I best prevent curious/malicious people from trying to intrude on my WiFi network? I am open to recommendations on specific models of routers.

    Read the article

  • iTunes Sync erasing Exchange calendar / contacts

    - by Garrett
    We have had a handful of instances where corporate iPhone users sync Calendar/Contacts/etc. in their iTunes settings, and we would like to prevent this. Unfortunately, when they sync their empty home Outlook calendars, it overwrites everything in their iPhone calendar. This has the unpleasant side effect of "updating" Exchange and wiping out every meeting they have. Luckily, our backups have bailed us out in each case - there seems to be no recovering from it any other way, as the data is gone. We would prefer to allow our users to continue loading media onto their phones, which we believe requires iTunes. Is there a way, through Exchange ActiveSync or iOS mobile management, to prevent this from happening?

    Read the article

  • securing unpatched websites

    - by neuron
    I have a client with a lot (read: several thousand) of websites in several old CMS solutions that are no longer maintained. Moving all of them to a maintained solution isn't really an option at this point, so I'm thinking about ways to secure the solutions without patching them. The solutions are mostly Joomla 1.0/1.5 and WordPress. What I'm thinking is something like this:
    - mod_suexec to lock everyone into their own home directory
    - AppArmor to deny any and all file writes by default (deny by default, whitelist things like "images" directories)
    - .htaccess to prevent anything in writable directories from being executed, i.e. disable the PHP engine for images/ directories (see the sketch after this list)
    - MySQL triggers on the "users" tables to prevent adding new admins/superadmins
    Does this make sense? Is it viable? Am I missing something obvious?
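    For the .htaccess idea, this is roughly what I have in mind. It is untested; it assumes PHP runs as mod_php, that AllowOverride permits these directives, and the directory glob is only an example of the layout:

        # drop an execution-blocking .htaccess into every writable/upload directory
        for dir in /var/www/*/images /var/www/*/uploads; do
            [ -d "$dir" ] || continue
            printf '%s\n' \
                'php_flag engine off' \
                'RemoveHandler .php .phtml .php3' \
                'Options -ExecCGI -Indexes' \
                > "$dir/.htaccess"
        done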

    Read the article

  • Concerns about a dedicated server (Windows Server 2008) + DDoS

    - by TheKillerDev
    I currently have a dedicated server with these specs: Intel Core i5 750, 2x120GB (SSD + RAID), Windows Server 2008 Web, 200Mbps network, 24 GB DDR3. I would like to know the best things I can do to prevent a DDoS attack, since I know this will be a real threat given the importance of the files that will be stored on it. Right now I have Apache listening on port 80 and Remote Desktop listening on port 3389, and security is handled only by Windows Firewall. So, any thoughts on what would be good protection against DDoS attacks?

    Read the article

  • Setting permissions on user accounts

    - by Ron Porter
    We would like to lock a couple of accounts to prevent even domain admins from resetting the password without already knowing the current password. From what I can see in the permission sets, this looks possible. Everything I've found on the subject recommends against altering the default permissions, but doesn't go into detail about why. Assuming that the domain admins retain the ability to reset passwords without knowing current passwords, is it reasonable to prevent password resets on the domain admin account and maybe a couple of others? If not, why not?

    Read the article

  • Grandma's Computer - Can a user that belongs only to the "Users" group in Windows XP install malware, viruses or IE add-ons?

    - by DanC
    I am trying to figure out if having a user in the "Users" group will be enough to prevent her from installing unwanted software. The things that I don't want the user to be able to install are: viruses, malware, Bandoo-type stuff, and Internet Explorer add-ons. To put you in context, I am thinking of my grandma's computer: I want her to be able to read all her email and attachments, but without the hassle of needing to reinstall the whole computer every few months. The computer will run Windows XP with some free antivirus. It will not be part of any domain; it is just a home computer. As for Linux, I have tried getting her to use it, but she was already accustomed to Windows and it was not really an option to have her re-learn where the shutdown button was. So, are these measures enough to prevent her from installing unwanted software? What other options come to mind? Thanks

    Read the article

  • In Windows XP, is it possible to disable user credential caching for particular users

    - by kdt
    I understand that when Windows caches user credentials, these can sometimes be used by malicious parties to access other machines once a machine containing the cached credentials is compromised, a method known as "pass the hash" [1]. For this reason I would like to get control over what is cached, to reduce the risk of cached credentials being used maliciously. It is possible to prevent all caching by zeroing HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\CachedLogonsCount, but this is too indiscriminate: laptop users need to be able to log in when away from the network. What I would like to do is prevent the caching of credentials of certain users, such as administrators. Is there any way to do that in Windows XP? [1] http://www.lbl.gov/cyber/systems/pass-the-hash.html

    Read the article

  • SQL Server CE rollback does not undo delete.

    - by INTPnerd
    I am using SQL Server CE 3.5 and C# with the .NET Compact Framework 3.5. In my code I am inserting a row, then starting a transaction, then deleting that row from a table, and then doing a rollback on that transaction. But this does not undo the deletion. Why not? Here is my code:

        SqlCeConnection conn = ConnectionSingleton.Instance;
        conn.Open();
        UsersTable table = new UsersTable();
        table.DeleteAll();
        MessageBox.Show("user count in beginning after delete: " + table.CountAll());
        table.Insert(new User() { Id = 0, IsManager = true, Pwd = "1234", Username = "Me" });
        MessageBox.Show("user count after insert: " + table.CountAll());
        SqlCeTransaction transaction = conn.BeginTransaction();
        table.DeleteAll();
        transaction.Rollback();
        transaction.Dispose();
        MessageBox.Show("user count after rollback delete all: " + table.CountAll());

    The messages indicate that everything works as expected until the very end, where the table has a count of 0, indicating the rollback did not undo the deletion.

    Read the article

  • How to disable 3G USB Modem internal storage from being loaded by linux kernel?

    - by Krystian
    Hi, I've got a problem with my 3G modem [Huawei E122]. It has internal storage, and the kernel assigns a device [/dev/sdX] to it. Because of that, every second boot my machine will not come up - kernel panic - as my USB hdd gets assigned /dev/sdb instead of /dev/sda. I cannot use LABEL or UUID in the root= kernel parameter, as that is only available when using an initrd, and I can't use one - I am using Debian on my router, a mips architecture machine. I have to prevent this from happening, as my router has to start every day and I have to be sure it works ok. I don't have physical access to restart it when something goes wrong. I don't use the modem's internal storage; there is no SD card inserted. However, the kernel detects the card reader and creates a device for it anyway. I cannot prevent loading of USB storage drivers altogether, since my hdd is on USB as well. I will appreciate any ideas.
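    One idea I am wondering about, untested, and only if my kernel is recent enough to support it: the usb-storage "quirks" option, which can tell the driver to ignore a given vendor:product ID. The IDs below are placeholders, not verified IDs for the E122; I would take the real ones from lsusb:

        # as a module option (placeholder IDs; use the ones lsusb reports):
        echo 'options usb-storage quirks=12d1:1446:i' > /etc/modprobe.d/ignore-modem-storage.conf
        # or, if usb-storage is built into the kernel, on the kernel command line:
        #   usb-storage.quirks=12d1:1446:i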

    Read the article

  • Problem with the output of Jquery function .offset in IE

    - by vandalk
    Hello! I'm new to jQuery and JavaScript, and to web site development overall, and I'm having a problem with the .offset function. I have the following code working fine on Chrome and FF but not on IE:

        $(document).keydown(function(k){
            var keycode=k.which;
            var posk=$('html').offset();
            var centeryk=screen.availHeight*0.4;
            var centerxk=screen.availWidth*0.4;
            $("span").text(k.which+","+posk.top+","+posk.left);
            if (keycode==37){ k.preventDefault(); $("html,body").stop().animate({scrollLeft:-1*posk.left-centerxk}) };
            if (keycode==38){ k.preventDefault(); $("html,body").stop().animate({scrollTop:-1*posk.top-centeryk}) };
            if (keycode==39){ k.preventDefault(); $("html,body").stop().animate({scrollLeft:-1*posk.left+centerxk}) };
            if (keycode==40){ k.preventDefault(); $("html,body").stop().animate({scrollTop:-1*posk.top+centeryk}) };
        });

    What I want it to do is scroll the window a set percentage using the arrow keys, so my thought was to find the current coordinates of the top left corner of the document, add a percentage relative to the user's screen to it, and animate the scroll so that the content doesn't jump and the user doesn't lose focus from where he was. The $("span").text calls are just so I know what's happening and will be turned into comments when the code is complete. So here is what happens: on Chrome and Firefox the output of $("span").text for the position variables is correct, starting at 0,0 and always showing how much of the content was scrolled, but on IE it starts at -2,-2 and never gets out of it; even if I manually scroll the window to the end and use the right arrow key, it still returns the initial value of -2,-2 and scrolls back to the beginning. I tried substituting the offset for document.body.scrollLeft and scrollTop but the result is the same, only this time the coordinates are 0,0. Am I doing something wrong? Or is this some IE bug? Is there a way around it or some other function I can use to achieve the same results? On another note, I made two other navigation options for the user in this section of the site; one is to click and drag anywhere on the screen to move it:

        $("html").mousedown(function(e) {
            var initx=e.pageX
            var inity=e.pageY
            $(document).mousemove(function(n) {
                var x_inc= initx-n.pageX;
                var y_inc= inity-n.pageY;
                window.scrollBy(x_inc*0.7,y_inc*0.7);
                initx=n.pageX;
                inity=n.pageY
                //$("span").text(initx+ "," +inity+ "," +x_inc+ "," +y_inc+ "," +e.pageX+ "," +e.pageY+ "," +n.pageX+ "," +n.pageY);
                // cancel out any text selections
                document.body.focus();
                // prevent text selection in IE
                document.onselectstart = function () { return false; };
                // prevent IE from trying to drag an image
                document.ondragstart = function() { return false; };
                // prevent text selection (except IE)
                return false;
            });
        });
        $("html").mouseup(function() {
            $(document).unbind('mousemove');
        });

    The only part of this code I didn't write was the text-selection-prevention lines, which I found in a tutorial about clicking and dragging objects. Anyway, this code works fine on Chrome, Firefox and IE, though on Firefox and IE movement glitches happen more often while you drag; sometimes the "scrolling" seems a little jagged. It's only a visual thing and not that significant, but if there's a way to prevent it I would like to know.

    Read the article

  • Does the PCI bus really occupy 1.8GB?

    - by Neil
    I am using a Dell Vostro 1700 laptop which currently has 2GB of RAM, and I was considering buying some more memory to upgrade it to 4GB. I am running 32-bit Windows Vista and I know that there can be issues that prevent it from making use of a full 4GB, which I believe relate to the fact that memory-mapped devices, e.g. the graphics card, also need to be allocated addresses in the 4GB range addressable with 32 bits. Consequently I was looking at Device Manager - Resources by connection - Memory to see which devices were allocated which memory addresses. I was surprised to see that there was an entry for [80000000 - F3FFFFFF] PCI bus. That is a 1.8GB range of addresses. When I expanded it, the only thing in it was [E0000000 - EFFFFFFF] NVIDIA GeForce 8600M GT, which is only 256MB. So my question is: does the PCI bus really occupy 1.8GB of address space, and will it prevent my computer from making use of any more memory than it already has?
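    For reference, my arithmetic on those two ranges:

        0xF3FFFFFF - 0x80000000 + 1 = 0x74000000 = 1,946,157,056 bytes, i.e. about 1.81 GB
        0xEFFFFFFF - 0xE0000000 + 1 = 0x10000000 =   268,435,456 bytes, i.e. exactly 256 MB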

    Read the article

  • Calling NSFetchedResultsController & CoreData experts

    - by JK
    I am having a few nagging issues with NSFetchedResultsController and CoreData, any of which I would be very grateful to get help on.

    Issue 1 - Updates: I update my store on a background thread, which results in certain rows being deleted, inserted or updated. The changes are merged into the context on the main thread using the "mergeChangesFromContextDidSaveNotification:" method. Inserts and deletes are updated properly, but updates are not (e.g. the cell label is not updated with the change), although I have confirmed that the updates come through the contextDidSaveNotification, exactly like the inserts and deletes. My current workaround is to temporarily change the staleness interval of the context to 0, but this does not seem like the ideal solution.

    Issue 2 - Deleting objects: My fetch batch size is 20. If an object within the first 20 rows is deleted by the background thread, everything works fine. But if the object is after the first 20 rows and the table is scrolled down, a "CoreData could not fulfill a fault" error is raised. I have tried resaving the context and reperforming the frc fetch - all to no avail. Note: in this scenario, the frc delegate method "didChangeObject..." is not called for the delete - I assume this is because the object in question had not been faulted at that time (as it was outside the initial fetch range). But for some reason, the context still thinks the object is around, although it has been deleted from the store.

    Issue 3 - Deleting sections: When the deletion of a row leads to the deletion of a section, I have gotten the "invalid number of rows in section???" error. I have worked around this by removing the "reloadSection" line from the NSFetchedResultsChangeMove: section and replacing it with "[tableView insertRowsAtIndexPaths...". This seems to work, but once again, I am not sure if this is the best solution. Any help would be greatly appreciated. Thank you!

    Read the article

  • What else can I do to secure my Linux server?

    - by eric01
    I want to put a web application on my Linux server. I will first explain what the web app will do and then tell you what I have done so far to secure my brand new Linux system. The app will be a classified ads website (like gumtree.co.uk) where users can sell their items, upload images, and send to and receive emails from the admin. It will use SSL for some pages, and I will need SSH. The machine's sole purpose will be hosting the website. So far, this is what I did to secure my stock Ubuntu (latest version). NOTE: I probably did some things that will prevent the application from doing all its tasks, so please let me know about that. (I numbered the points so you can refer to them more easily.)

    1) Firewall: I installed Uncomplicated Firewall. Deny IN & OUT by default. Rules: allow IN & OUT for HTTP, IMAP, POP3, SMTP, SSH, UDP port 53 (DNS), UDP port 123 (SNTP), SSL, port 443. (The ones I didn't allow were FTP, NFS, Samba, VNC, CUPS.) When I install MySQL & Apache, I will open up port 3306 IN & OUT. (A rough transcript of the ufw commands I mean is sketched after this list.)

    2) Secure the partition: in /etc/fstab I added the following line at the end: tmpfs /dev/shm tmpfs defaults,rw 0 0 - then in the console: mount -o remount /dev/shm

    3) Secure the kernel: in /etc/sysctl.conf there are a few different filters to uncomment. I didn't know which ones are relevant to web app hosting. Which ones should I activate? They are the following:
        A) Turn on Source Address Verification in all interfaces to prevent spoofing attacks
        B) Uncomment the next line to enable packet forwarding for IPv4
        C) Uncomment the next line to enable packet forwarding for IPv6
        D) Do not accept ICMP redirects (we are not a router)
        E) Accept ICMP redirects only for gateways listed in our default gateway list
        F) Do not send ICMP redirects
        G) Do not accept IP source route packets (we are not a router)
        H) Log Martian Packets

    4) Configure the passwd file: replace "sh" with "false" for all accounts except the user account and root. I also did it for the account called sshd; I am not sure whether that will prevent SSH connections (which I want to use) or if it is something else.

    5) Configure the shadow file: in the console, passwd -l to lock all accounts except the user account.

    6) Install rkhunter and chkrootkit.

    7) Install Bum. Disabled these services: "High performance mail server", "unreadable (kerneloops)", "unreadable (speech-dispatcher)", "Restores DNS" (should this one stay on?).

    8) Install apparmor-profiles.

    9) Install clamav & freshclam (antivirus and updates).

    What did I do wrong and what should I do more to secure this Linux machine? Thanks a lot in advance
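    Concretely, the ufw rules in point 1 are roughly the following (a sketch from memory rather than my exact transcript, with the port numbers as listed above):

        sudo ufw default deny incoming
        sudo ufw default deny outgoing
        # inbound services
        sudo ufw allow in 80/tcp      # HTTP
        sudo ufw allow in 443/tcp     # HTTPS (SSL)
        sudo ufw allow in 22/tcp      # SSH
        sudo ufw allow in 25/tcp      # SMTP
        sudo ufw allow in 143/tcp     # IMAP
        sudo ufw allow in 110/tcp     # POP3
        # outbound traffic the box itself needs
        sudo ufw allow out 53/udp     # DNS
        sudo ufw allow out 123/udp    # SNTP
        sudo ufw allow out 25/tcp     # outgoing mail
        sudo ufw allow out 80/tcp
        sudo ufw allow out 443/tcp
        # later, once MySQL & Apache are installed:
        # sudo ufw allow 3306/tcp
        sudo ufw enable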

    Read the article

  • Switch to switch encryption over a wireless bridge (TrustSec?)

    - by metatheorem
    I am planning to connect an existing Cisco 3750 switch to a 3560C switch over a wireless PTP bridge. The bridge will be WPA2 protected, but I am looking for an additional measure of security between the switches to prevent other wireless access through either switch. They do not support IPSec, only 802.1Q tunnels, and buying additional hardware is not likely an option. I am looking into using TrustSec manual mode between the switches. After some effort reading into TrustSec and MACsec, I am mostly certain this is a good choice over the wireless bridge, keeping in mind it is a shared medium. Two questions: Can I reliably prevent other wireless traffic from accessing the switches using TrustSec? Does anyone know of any better options with the 3000 series switches?

    Read the article

  • Web-Server directory permissions

    - by MLS
    Hello all, I would like some help understanding web-server directory permissions (Apache, CentOS, PHP, MySQL). For example, I have multiple sites in /var/www/html, in paths like /var/www/html/www_domainname_com, and inside each site I might have a path like /lib/mysql/ containing PHP connection code, database config, etc. What should my permissions be so that someone cannot just browse to that directory? Should I just .htaccess them? I have apache:apache as the owner of all my web directories. Also, can I prevent someone from crawling certain directories of my web-server? I have a robots.txt, but who is to say the crawler obeys it? So to sum up: 1. What is the best owner/permission set for my sensitive files that the web-server, PHP or MySQL needs, but that I don't want people browsing to? 2. Can I prevent outright crawling of portions of the site?
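    To make the question concrete, this is the sort of thing I am guessing at for the sensitive directory. It is only a sketch: I don't know if these are the right owners/modes, the paths are just my example layout, and the Deny syntax assumes Apache 2.2:

        # tighten ownership and modes on the config directory
        chown -R root:apache /var/www/html/www_domainname_com/lib/mysql
        chmod 750 /var/www/html/www_domainname_com/lib/mysql
        chmod 640 /var/www/html/www_domainname_com/lib/mysql/*

        # and/or block HTTP access to it entirely
        printf '%s\n' 'Order allow,deny' 'Deny from all' \
            > /var/www/html/www_domainname_com/lib/mysql/.htaccess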

    Read the article

  • Remote Debian System Preventing Logon

    - by choobablue
    I have a dozen or so single board computers on a network running Debian (squeeze) and access them via ssh (the ssh server is dropbear). To give an idea of the hardware: they're 1.2 GHz x86 processors with 1GB of RAM and 4GB flash drives formatted as ext2 (I avoided ext3 to prevent the added flash write stress from journaling); there is also a swap partition on the drive. Normally the setup I'm using works great and I can access all the computers. Every once in a while one will prevent access. What happens is that I try to connect via ssh (putty) and it gives me the login prompt; I enter the username and password and it responds 'Access Denied', and it will also refuse any public key in ~/.ssh/authorized_keys. The credentials are correct, as they worked previously. The computer responds to pings and putty recognizes the server public key, which implies to me the system is still running. Restarting the server fixes the problem and I can log in again. (I tried a temporary fix of putting shutdown -r now in the root crontab, but this doesn't seem to run reliably once the hang happens.) Once I restart, however, there doesn't seem to be any information in any of the system logs to indicate what happened; the logs are simply empty for that time period, as if the system had crashed. There is some custom software running on the system which appears to stop working (which is why I wanted to ssh in to begin with). I'm assuming that this program is the source of the problems, but I'm unsure how it would cause this and how to debug what is happening. The most likely explanation I can think of is that there is a memory leak in the other program that then prevents dropbear from spawning a new login shell (and crontab from executing shutdown) because there is not enough free memory. But looking at memory usage on the other (working) computers, there doesn't seem to be any meaningful increase in memory to indicate a leak (unless it's a very big, fast-acting and rare leak). I would think that when the OS ran out of memory it would restart the system or kill processes (the Linux kernel kills processes when out of memory, right?). The other thing I wonder about is whether the fact that they are running off a flash drive could have some effect, especially the swap partition (which I think I should remove to prevent wear of the flash), but the flash drives are young (~1 month) and I don't think wear would be a factor yet. Does anybody have an idea of what could cause these symptoms, whether it could be caused by a memory leak or something else I haven't thought of? And does anybody know of a method to debug the problem and find out more about what's going wrong?
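    To test the memory-leak theory, I am planning to log memory usage periodically so that I have data from before the next hang. Something like this in the root crontab (just a sketch; the path and interval are arbitrary):

        # snapshot memory use and the top memory consumers every 5 minutes
        */5 * * * * ( date; free -m; ps -eo pid,rss,comm --sort=-rss | head -n 5 ) >> /var/log/memwatch.log 2>&1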

    Read the article

  • Preventing an Apache 2 Server from Logging Sensitive Data

    - by jstr
    Apache 2 by default logs the entire request URI, including the query string, of every request. What is a straightforward way to prevent an Apache 2 web server from logging sensitive data, for example passwords, credit card numbers, etc., but still log the rest of the request? I would like to log all log-in attempts, including the attempted username, as Apache does by default, and prevent Apache from logging the password directly. I have looked through the Apache 2 documentation and there doesn't appear to be an easy way to do this other than completely preventing logging of these requests (using SetEnvIf). How can I accomplish this?
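    The closest I have come up with myself is per-URL rather than per-parameter: mark the sensitive endpoints with SetEnvIf and log those requests with a format that uses %U (path only) instead of %r (the full request line with query string). Something like the untested sketch below; the /login path, file names and Debian-style conf.d location are made up, and this drops the whole query string for those URLs (username included), which is coarser than what I actually want:

        # write a small conf snippet with a redacted log format for login URLs
        printf '%s\n' \
          'SetEnvIf Request_URI "^/login" redactquery' \
          'LogFormat "%h %l %u %t \"%m %U %H\" %>s %b" redacted' \
          'LogFormat "%h %l %u %t \"%r\" %>s %b" plain' \
          'CustomLog /var/log/apache2/access.log plain env=!redactquery' \
          'CustomLog /var/log/apache2/access.log redacted env=redactquery' \
          > /etc/apache2/conf.d/redact-query.conf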

    Read the article

  • JQUERY - how to get updated value after ajax removes data from within it?

    - by Brian
    I have an element with thumbnails. I allow users to sort their display order (which fires off an update to the DB via ajax). I also allow them to delete images (which, after deletion, fires off a request to update the display order for all remaining images). My problem is with binding or live I think, but I don't know where to apply it. The array fired off upon delete contains ALL the ids for the images that were there on page load. The issue is that after they delete an image, the array STILL contains the original ids (including the one that was deleted), so it is obviously not refreshing the value of the element after ajax has removed things from inside it. I need to tell it to go get the refreshed contents... From what I have been reading, this is normal, but I don't understand how to tie it into my routine. I need to trigger the mass re-ordering after any deletion. Any ideas, gurus?

        $('a.delimg').click(function(){
            var parent = $(this).parent().parent();
            var id = $(this).attr('id');
            $.ajax({
                type: "POST",
                url: "../updateImages.php",
                data: "action=delete&id=" + id,
                beforeSend: function() {
                    parent.animate({'backgroundColor':'#fb6c6c'},300);
                    $.jnotify("<strong>Deleting This Image & Updating The Image Order</strong>", 5000);
                },
                success: function(data) {
                    parent.slideUp(300,function() {
                        parent.remove();
                        $("#images ul").sortable(function() {
                            //NEEDS TO GET THE UPDATED CONTENT
                            var order = $(this).sortable("serialize") + '&action=updateRecordsListings';
                            $.post("../updateImages.php", order, function(theResponse){
                                $.jnotify("<strong>" + theResponse + "</strong>", 2000);
                            });
                        });
                    });
                }
            });
            return false;
        });

    Thanks for any help you can give.

    Read the article

  • Build .deb package from source, without installing it

    - by Mechanical snail
    Suppose I have an installer program or source tarball for some program I want to install. (There is no Debian package available.) First I want to create a .deb package out of it, in order to be able to cleanly remove the installed program in the future (see "Uninstalling application built from source" and "If I build a package from source how can I uninstall or remove completely?"). Also, installing using a package prevents it from clobbering files from other packages, which cannot be guaranteed if you run the installer or sudo make install.

    Checkinstall: from reading the answers there and elsewhere, I gather the usual solution is to use checkinstall to build the package. Unfortunately, it seems checkinstall does not prevent make install from clobbering system files from other packages. For example, according to "Reverting problems caused by checkinstall with gcc build":

        I created a Debian package from the install using sudo checkinstall -D make install. [...] I removed it using Synaptic Package Manager. As it turns out, [removing] the package checkinstall created from make install tried to remove every single file the installation process touched, including shared gcc libraries like /lib64/libgcc_s.so.

    I then tried to tell checkinstall to build the package without installing it, in the hope of bypassing the issue. I created a dummy Makefile:

        install:
        	echo "Bogus" > /bin/qwertyuiop

    and ran sudo checkinstall --install=no. The file /bin/qwertyuiop was created, even though the package was not installed. In my case, I do not trust the installer / make install not to overwrite system files, so this use of checkinstall is ruled out. How can I build the package without installing it or letting it touch system files? Is it possible to run checkinstall in a fakechrooted debootstrap environment to achieve this? Preferably the build should be done as a normal user rather than root, which would prevent the process from overwriting system files if it goes wrong.
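    The fallback I keep coming back to is skipping checkinstall entirely: stage the install into a scratch directory as a normal user, then pack that up with dpkg-deb. A rough sketch, where the package name, version and control fields are placeholders, and which only works if the project's Makefile honours DESTDIR:

        mkdir -p pkgroot/DEBIAN
        # minimal control file (placeholder values)
        printf '%s\n' \
          'Package: myprogram' \
          'Version: 1.0-1' \
          'Architecture: amd64' \
          'Maintainer: nobody <nobody@example.com>' \
          'Description: locally built myprogram' \
          > pkgroot/DEBIAN/control
        # stage the files without touching the real filesystem
        make install DESTDIR="$PWD/pkgroot"
        # build the .deb as a normal user
        fakeroot dpkg-deb --build pkgroot myprogram_1.0-1_amd64.deb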

    Read the article

  • Unable to uninstall maas completely

    - by user210844
    I'm not able to uninstall MAAS:

        $ sudo apt-get purge maas ; sudo apt-get autoremove
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Package 'maas' is not installed, so not removed
        0 upgraded, 0 newly installed, 0 to remove and 2 not upgraded.
        2 not fully installed or removed.
        After this operation, 0 B of additional disk space will be used.
        Setting up maas-region-controller (1.2+bzr1373+dfsg-0ubuntu1) ...
        Considering dependency proxy for proxy_http:
        Module proxy already enabled
        Module proxy_http already enabled
        Module expires already enabled
        Module wsgi already enabled
        sed: -e expression #1, char 91: unterminated `s' command
        dpkg: error processing maas-region-controller (--configure):
         subprocess installed post-installation script returned error exit status 1
        No apport report written because MaxReports is reached already
        dpkg: dependency problems prevent configuration of maas-dns:
         maas-dns depends on maas-region-controller (= 1.2+bzr1373+dfsg-0ubuntu1); however:
          Package maas-region-controller is not configured yet.
        dpkg: error processing maas-dns (--configure):
         dependency problems - leaving unconfigured
        No apport report written because MaxReports is reached already
        Errors were encountered while processing:
         maas-region-controller
         maas-dns
        E: Sub-process /usr/bin/dpkg returned an error code (1)
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        0 upgraded, 0 newly installed, 0 to remove and 2 not upgraded.
        2 not fully installed or removed.
        After this operation, 0 B of additional disk space will be used.
        Setting up maas-region-controller (1.2+bzr1373+dfsg-0ubuntu1) ...
        Considering dependency proxy for proxy_http:
        Module proxy already enabled
        Module proxy_http already enabled
        Module expires already enabled
        Module wsgi already enabled
        sed: -e expression #1, char 91: unterminated `s' command
        dpkg: error processing maas-region-controller (--configure):
         subprocess installed post-installation script returned error exit status 1
        No apport report written because MaxReports is reached already
        dpkg: dependency problems prevent configuration of maas-dns:
         maas-dns depends on maas-region-controller (= 1.2+bzr1373+dfsg-0ubuntu1); however:
          Package maas-region-controller is not configured yet.
        dpkg: error processing maas-dns (--configure):
         dependency problems - leaving unconfigured
        No apport report written because MaxReports is reached already
        Errors were encountered while processing:
         maas-region-controller
         maas-dns
        E: Sub-process /usr/bin/dpkg returned an error code (1)
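    What I am tempted to try next, unless someone warns me off, is to move the failing post-installation script out of the way so dpkg can finish configuring, then purge normally. Just a sketch, and I have no idea if it is safe; it assumes the prerm/postrm scripts do not fail in the same way:

        sudo mv /var/lib/dpkg/info/maas-region-controller.postinst \
                /root/maas-region-controller.postinst.broken
        sudo dpkg --configure -a
        sudo apt-get purge maas-region-controller maas-dns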

    Read the article
