Wikipedia:Reference desk/Archives/Computing/2010 September 23

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


September 23

Replacing software

I bought a new laptop which came with:

  • Microsoft Works
  • Microsoft Office 2007 (60 day trial)
  • Norton Internet Security (60 day trial)

I also purchased a copy of Norton 360 and a copy of Microsoft Office 2010 Student Edition and would like to know the following:

  • Before installing Office 2010, do I have to uninstall Office 2007?
  • Do I have to uninstall Microsoft Works? (I believe I've heard that it will clash with Office, although, as I said, they both came pre-installed on this computer, which suggests that they don't.)
  • Do I have to uninstall Norton Internet Security before installing Norton 360?
  • To install Office 2010 and Norton 360, do I actually need to use the discs, or can I just use the product keys to update everything?

--81.23.48.100 (talk) 01:48, 23 September 2010 (UTC)[reply]

That's what I would do, just in case: uninstall all of those programs to prevent the possibility of any conflicts. Doing so will also free up disk space and clean up the registry. I haven't used Office 2010 much, but I did try to run Office 2007 and Office 2003 on the same computer and I had all kinds of issues. It took many hours of work before I gave up and just removed Office 2003. It'd also be a disaster if both Norton Internet Security and Norton 360 started automatically whenever you started your computer, right? Norton is a huge resource hog, and it always messes with programs on your computer, often preventing them from working properly. So, one fat nanny is probably more than enough in this case. Also, Works has fewer features than Office, so there's no reason to keep it.--Best Dog Ever (talk) 02:14, 23 September 2010 (UTC)[reply]

Just make sure you get the Norton Removal Tool before you try uninstalling any Norton software. ¦ Reisio (talk) 04:36, 23 September 2010 (UTC)[reply]

I've never heard of any problems between Microsoft Works and Office. Obviously, you probably won't be using Works much, but it may have some features that the student edition of Office 2010 doesn't (a calendar and a database, for example). Unless you really need the extra disk space, or it does create some sort of problem, I wouldn't bother getting rid of it. Buddy431 (talk) 14:46, 23 September 2010 (UTC)[reply]
I've been running Microsoft Works along with Microsoft Office for three years and have never had a problem. Dbfirs 21:20, 23 September 2010 (UTC)[reply]
You might have a reason to keep Works if it has features that you need and that are not present in Office, but in general I would uninstall all three pre-installed products/trials and install the replacement products using the discs that I should have got when I bought them. If you don't have install discs and are instead expected to use up your internet bandwidth and your own time to download them, do that before uninstalling anything. And if downloading, only download from the proper, official site and not some random torrent you found somewhere. Astronaut (talk) 20:08, 24 September 2010 (UTC)[reply]

Internet

Occasionally a website will become unreachable from my internet connection, even though the site is fully up and operational (as checked via a proxy and downforeveryoneorjustme.com). I've tried flushing the DNS cache, which has no effect, and browsing to the site's IP address also doesn't work. Why does this happen? Is it a problem with my computer, the ISP, or something else? 82.44.55.25 (talk) 17:34, 23 September 2010 (UTC)[reply]

That's hard for anyone else to tell... you can figure out a lot by using the proper network diagnostics (e.g. traceroute), but without such information, it's anyone's guess. If it's a popular site, and you're on a big ISP, and no one else on Twitter is complaining, it's probably something on your end. Aside from that, browsing by IP address rarely works nowadays because virtual hosting is used for most websites. If you can find someone in the same neighbourhood using the same ISP, you could compare whether they have similar problems. Unilynx (talk) 18:01, 23 September 2010 (UTC)[reply]
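For illustration, diagnostics along the following lines (example.com stands in for the unreachable site; on Windows the rough equivalents are ping -n, nslookup and tracert) can narrow down whether the failure is in DNS, somewhere along the route, or at the server itself:

  ping -c 4 example.com        # can the host be reached at all?
  nslookup example.com         # does the name resolve, and to which address?
  traceroute example.com       # where along the path do packets stop?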
My router sometimes refuses to load pages from www.bbc.co.uk and I have to turn it off and on to get there. --Phil Holmes (talk) 18:06, 23 September 2010 (UTC)[reply]
Your IP looks kind of familiar; I think we've answered a lot of your questions regarding wget before, and what you asked suggests that you were trying to copy, wholesale, the content of a site that is not under your control. (This may or may not be legal, and we don't give legal advice here. I'm just saying you might want to check with a qualified person whether what you're doing is legal.) A site owner who doesn't want to have her/his site "scraped" may try to keep you out using a robots.txt file - but if that fails because you're ignoring the request in that file, s/he might put a temporary or permanent block on your IP, denying you access to her/his site. -- 78.43.71.155 (talk) 17:08, 24 September 2010 (UTC)[reply]
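For what it's worth, a quick look at a site's robots.txt shows whether its operator has asked crawlers to stay away, and wget's rate-limiting options make a recursive fetch less likely to trigger a block (example.com is a placeholder here; wget's recursive mode honours robots.txt by default):

  wget -O - http://example.com/robots.txt       # print the site's crawling rules
  wget -r --wait=2 --random-wait --limit-rate=50k http://example.com/page.html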

Timeout

The default timeout in wget is 900 seconds. That seems very long to me; usually if a site doesn't respond in 10 seconds it isn't going to. Would lowering the timeout to 10 seconds negatively affect wget's functioning? 82.44.55.25 (talk) 22:51, 23 September 2010 (UTC)[reply]

According to man wget, you may find the following options of interest:
-t number
--tries=number
Set number of retries to number. Specify 0 or inf for infinite retrying. The default is to retry 20 times, with the exception of fatal errors like "connection refused" or "not found" (404), which are not retried.
-T seconds
--timeout=seconds
Set the network timeout to seconds seconds. This is equivalent to specifying --dns-timeout, --connect-timeout, and --read-timeout, all at the same time.
When interacting with the network, Wget can check for timeout and abort the operation if it takes too long. This prevents anomalies like hanging reads and infinite connects. The only timeout enabled by default is a 900-second read timeout. Setting a timeout to 0 disables it altogether. Unless you know what you are doing, it is best not to change the default timeout settings.
All timeout-related options accept decimal values, as well as subsecond values. For example, 0.1 seconds is a legal (though unwise) choice of timeout. Subsecond timeouts are useful for checking server response times or for testing network latency.
--dns-timeout=seconds
Set the DNS lookup timeout to seconds seconds. DNS lookups that don’t complete within the specified time will fail. By default, there is no timeout on DNS lookups, other than that implemented by system libraries.
--connect-timeout=seconds
Set the connect timeout to seconds seconds. TCP connections that take longer to establish will be aborted. By default, there is no connect timeout, other than that implemented by system libraries.
There is a lot more related to timeouts and traffic. Just check the man page. -- kainaw 02:04, 24 September 2010 (UTC)[reply]
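For example (the URL is only a placeholder), the timeout and retry count described above can be combined on one command line, in either long or short form:

  wget --timeout=10 --tries=3 http://example.com/file.iso
  wget -T 10 -t 3 http://example.com/file.iso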
I've read the manual already, but that doesn't answer my question. It tells me how to set the timeout settings; it doesn't tell me whether setting the timeout to 10 seconds will negatively affect wget's functioning. 82.44.55.25 (talk) 11:04, 24 September 2010 (UTC)[reply]
Lowering the timeout will make it time out sooner. What else do you expect it to do? -- kainaw 12:14, 24 September 2010 (UTC)[reply]
Clearly, he's wondering why it's set so damn high in the first place!
(The implication being that if there's a good reason he hasn't thought of, he won't mess with it.)
I'm afraid I don't have a good answer, but it looks like it's been that way for some time. My guess would be that it's there for some historical reason. Personally, I've set it to 30 and not worried about it, and I don't recall ever having had an issue. APL (talk) 13:22, 24 September 2010 (UTC)[reply]
I have found it is often better to set a smaller timeout with more retries, particularly when the link is lossy and overloaded. If you are watching it, you can abort it and redo it with the -c option to continue from where it got up to (if the web site supports it). Graeme Bartlett (talk) 08:31, 26 September 2010 (UTC)[reply]
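A sketch of that approach (placeholder URL): a short timeout, plenty of retries, and -c so that a re-run resumes from where the previous attempt got up to, assuming the server supports resuming:

  wget -c -T 10 -t 10 http://example.com/large-file.zip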

What is something like "javascript:OpenDetailWin('<value>')" called

I'm writing a doc page for {{cite gns}}. I need to identify (use a name for) something like "javascript:OpenDetailWin('<value>')". I need to say "a ??? will be found in the URL box". The government can't make anything easy, so the documentation is a bit convoluted. If you look at the doc page, you should know that I am still trying to find an easier way of locating the GUID (id number). –droll [chat] 23:14, 23 September 2010 (UTC)[reply]

How about "a Javascript function"? --Mr.98 (talk) 23:27, 23 September 2010 (UTC)[reply]
Thanks, that's what I guessed, but I hadn't a clue really. –droll [chat] 00:07, 24 September 2010 (UTC)[reply]
This format is sometimes called a "Javascript protocol" URL. It's not really a protocol, but the "javascript:" prefix occupies the place where the protocol (scheme) is usually specified in a URI. --Sean 16:02, 24 September 2010 (UTC)[reply]